
Misogynous Memes Recognition: Training vs Inference Bias Mitigation

Balducci, G.; Rizzi, G.; Fersini, E.
2025

Abstract

Warning: This paper contains examples of language and images which may be offensive. In this paper, we address the problem of automatic misogynous meme recognition by dealing with potentially biased elements that could lead to unfair models. In particular, a bias estimation technique is used to identify the textual and visual elements that unintentionally affect the model prediction, and several bias mitigation methods are proposed, investigating two different types of debiasing strategies, i.e., at training time and at inference time. The proposed approaches achieve remarkable results both in terms of prediction and generalization capabilities.
Journal article - Scientific article
Bias Mitigation, Misogyny Identification, Meme
English
1 June 2025
2025
11
open
Balducci, G., Rizzi, G., Fersini, E. (2025). Misogynous Memes Recognition: Training vs Inference Bias Mitigation. IJCOL, 11 [10.17454/IJCOL111.05].
Files in this record:
File: Balducci-2025-IJCOL-VoR.pdf
Access: open access
Attachment type: Publisher's Version (Version of Record, VoR)
License: Creative Commons
Size: 430.99 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/572463