

Preserving Utility in Fair Top-k Ranking with Intersectional Bias

Castelnovo, Alessandro; Mercorio, Fabio; Mezzanzanica, Mario
2023

Abstract

Ranking is required in many real-world applications, such as search, personalisation, recommendation, and filtering. Recent research has focused on developing reliable ranking algorithms that maintain fairness in their outcomes. However, only a few consider multiple protected groups, since this extension introduces significant challenges. While useful for research purposes, handling fairness through a single binary sensitive feature is inadequate when the algorithm must be deployed responsibly in real-world applications. Our work builds on Multinomial FA*IR, a fair top-k ranking algorithm with multiple protected groups, which we extend to give users the option to balance fairness and utility, adapting the final ranking accordingly. Our experimental results show that our approach can find better alternative solutions overlooked by Multinomial FA*IR without violating fairness boundaries. The code of the implemented solution and the experiments are publicly available to the community as a GitHub repository.
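To give a concrete, if simplified, picture of the kind of constraint such fair top-k rankings enforce, the sketch below shows a hypothetical greedy re-ranking with per-group minimum shares. It is not the paper's Multinomial FA*IR procedure (which derives per-prefix minimum counts from a multinomial model) nor its utility-preserving extension; the Candidate class, greedy_fair_topk, the min_props parameter, and the floor-based quota are all illustrative assumptions.

import math
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Candidate:
    name: str
    score: float   # utility / relevance score
    group: str     # protected-group label, possibly an intersectional one

def greedy_fair_topk(candidates: List[Candidate], k: int,
                     min_props: Dict[str, float]) -> List[Candidate]:
    """Illustrative sketch: fill the top-k by score, but whenever a protected
    group falls below its minimum target share at the current prefix length,
    promote the best remaining member of an under-represented group."""
    remaining = sorted(candidates, key=lambda c: c.score, reverse=True)
    ranking: List[Candidate] = []
    counts: Dict[str, int] = {g: 0 for g in min_props}
    while remaining and len(ranking) < k:
        pos = len(ranking) + 1
        # groups whose presence in a prefix of length `pos` would fall short
        needy = [g for g, p in min_props.items()
                 if counts[g] < math.floor(p * pos)]
        pool = [c for c in remaining if c.group in needy] or remaining
        best = max(pool, key=lambda c: c.score)
        remaining.remove(best)
        ranking.append(best)
        counts[best.group] = counts.get(best.group, 0) + 1
    return ranking

if __name__ == "__main__":
    # toy data: group "a" dominates the top scores, "b" and "c" are protected
    pool = [Candidate(f"c{i}", s, g) for i, (s, g) in enumerate(
        [(0.95, "a"), (0.90, "a"), (0.88, "a"), (0.70, "b"),
         (0.65, "c"), (0.60, "b"), (0.55, "c"), (0.50, "a")])]
    top = greedy_fair_topk(pool, k=5, min_props={"b": 0.2, "c": 0.2})
    print([(c.name, c.group, round(c.score, 2)) for c in top])

Running the example prints a top-5 list in which groups "b" and "c" each receive at least the floor of their target share at every prefix length, while the remaining positions go to the highest-scoring candidates overall.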
paper
Fair ranking system; Intersectional Bias; Post-processing Fairness Mitigation
English
4th International Workshop on Algorithmic Bias in Search and Recommendation, BIAS 2023, held as part of the 45th European Conference on Information Retrieval, ECIR 2023, Dublin, Ireland, 2 April 2023
Editors: Boratto, L.; Faralli, S.; Marras, M.; Stilo, G.
Published in: Advances in Bias and Fairness in Information Retrieval: 4th International Workshop, BIAS 2023, Dublin, Ireland, April 2, 2023, Revised Selected Papers
ISBN: 9783031372483
Year: 2023
Series: Communications in Computer and Information Science (CCIS), vol. 1840
Pages: 59-73
Alimonda, N., Castelnovo, A., Crupi, R., Mercorio, F., Mezzanzanica, M. (2023). Preserving Utility in Fair Top-k Ranking with Intersectional Bias. In Advances in Bias and Fairness in Information Retrieval 4th International Workshop, BIAS 2023, Dublin, Ireland, April 2, 2023, Revised Selected Papers (pp.59-73). Springer Science and Business Media Deutschland GmbH [10.1007/978-3-031-37249-0_5].
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/430998
Citations
  • Scopus 0
  • Web of Science (ISI) ND