How do online users respond to crowdsourced fact-checking?

Panizza, F.; Ronzani, P.; Morisseau, T.; Mattavelli, S.; Martini, C.
2023

Abstract

Recently, crowdsourcing has been proposed as a tool for fighting misinformation online. Will internet users listen to crowdsourced fact-checking, and how? In this experiment we test how participants follow others’ opinions when evaluating the validity of a science-themed Facebook post, and examine which factors mediate the use of this information. Participants observed a post presenting either scientific information or misinformation, along with a graphical summary of previous participants’ judgements. Even though most participants reported not having used information from previous raters, their responses were influenced by previous assessments. This happened regardless of whether prior judgements were accurate or misleading. Presenting crowdsourced fact-checking, however, did not translate into blind copying of the majority response. Rather, participants tended to use this social information as a cue to guide their response, while also relying on individual evaluation and on searching for extra information. These results highlight the role of individual reasoning when evaluating online information, while pointing to the potential benefit of crowdsourcing-based solutions in making online users more resilient to misinformation.
Journal article - Scientific article
Keywords: misinformation, truth, crowdsourcing
Language: English
Date of publication: 25 Nov 2023
Year: 2023
Volume: 10
Issue: 1
Article number: 867
Access: open
Panizza, F., Ronzani, P., Morisseau, T., Mattavelli, S., & Martini, C. (2023). How do online users respond to crowdsourced fact-checking? Humanities and Social Sciences Communications, 10(1), 867. https://doi.org/10.1057/s41599-023-02329-y
Files in this record:
File: Panizza-2023-Humanit Soc Sci Commun-VoR.pdf (open access)
Attachment type: Publisher’s Version (Version of Record, VoR)
Licence: Creative Commons
Size: 1.16 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/453379
Citations
  • Scopus: 1
  • Web of Science: 0