Meta-research for evaluating replicability in psychological science: a roadmap to a crowdsourcing project

Federica Conte; Alessio Facchin; Ezia Rizzi; Michela Vezzoli; Cristina Zogmaister
2023

Abstract

Methodological transparency is one of the key elements of research replicability. Our project aims to assess a large number of published studies to evaluate their transparency and to investigate how the reporting of scientific methodologies has changed over the last decade. After conducting a pilot, we are currently setting up an extended crowdsourcing study to widen the scope of our research. In the pilot, we evaluated 180 experimental studies published in five social and general psychology journals and related their methodological transparency to basic publication characteristics (i.e., area, year, and open access policy). Results indicate that transparency has increased over the years and tends to be greater in articles from journals with a strong accessibility policy. Our plan is now to involve a large network of collaborators to analyse correlational as well as experimental studies from a broader range of psychological areas and to consider additional publication and journal features. We will discuss the challenges we encountered in setting up our first crowdsourced research project, the solutions we found, and some issues yet to be solved.
abstract + slide
methodology, open research, replicability, crowdsourcing study
English
XXIX Conferenza dell'Associazione Italiana di Psicologia (AIP)
2023
2023
open
Conte, F., Facchin, A., Giaquinto, F., Rizzi, E., Vezzoli, M., & Zogmaister, C. (2023). Meta-research for evaluating replicability in psychological science: a roadmap to a crowdsourcing project. Presentation at: XXIX Conferenza dell'Associazione Italiana di Psicologia (AIP), Lucca, Italy.
Files in this record:

Conte-2023-AIP.pdf

Open access

Description: Conference presentation
Attachment type: Other attachments
License: Creative Commons
Size: 8.24 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/467661
Citazioni
  • Scopus ND
  • Web of Science ND