Replication Crisis

Replicability of Research

Terminology

Terms like “reproducibility” and “replicability” are used differently across fields and researchers (Voelkl et al. 2025; Baker 2016; Nosek and Errington 2020). Here, I use the terminology as depicted in Figure 1 from The Turing Way Community (2021). Thus, a study is replicable if new data sets obtained from replication studies yield results compatible with the original study.

Figure 1: Turing Way Community, CC-BY 4.0

How much research is replicable?

The left panel in Figure 2 shows the replication success rate of several large-scale replication studies, as determined by the authors’ criteria. Note that the criteria for replication success differ between authors, and that different fields face unique challenges and are not directly comparable. Thus, this figure serves only as a high-level overview of replication success. The right panel shows the relative effect size (the replication effect size divided by the original effect size). Values smaller than 1 indicate that the replicated effect is smaller than the one reported in the original study.
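The relative effect size metric can be illustrated with a minimal sketch. The effect size values below are purely hypothetical and not taken from any of the cited studies:

```python
# Hypothetical pairs of original and replication effect sizes
# (e.g., correlation coefficients); illustrative values only.
originals = [0.50, 0.40, 0.30, 0.25]
replications = [0.25, 0.35, 0.10, 0.30]

# Relative effect size: replication effect divided by original effect.
relative = [r / o for r, o in zip(replications, originals)]

# A value below 1 means the replication found a smaller effect
# than the original study reported.
share_smaller = sum(rel < 1 for rel in relative) / len(relative)

print(relative)
print(share_smaller)
```

In large-scale replication projects such as those cited above, most relative effect sizes fall below 1, i.e., replication effects are typically smaller than the originally reported ones.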

Sources:

Prinz, Schlange, and Asadullah (2011); Begley and Ellis (2012); Klein et al. (2014); Open Science Collaboration (2015); Camerer et al. (2016); Klein et al. (2018); Camerer et al. (2018); Errington et al. (2021)

Preprints not yet included in the figure (to read):

  • Brodeur, Mikola, and Cook (2024)

Is there a replication crisis?

Note

“If we define a crisis as something that’s changing for the worse, we don’t have the evidence to say that’s happening.”

Brian Nosek, executive director, Center for Open Science (Chemical & Engineering News)

References

Baker, Monya. 2016. “Muddled Meanings Hamper Efforts to Fix Reproducibility Crisis.” Nature, June. https://doi.org/10.1038/nature.2016.20076.
Begley, C. Glenn, and Lee M. Ellis. 2012. “Raise Standards for Preclinical Cancer Research.” Nature 483 (7391): 531–33. https://doi.org/10.1038/483531a.
Brodeur, Abel, Derek Mikola, and Nikolai Cook. 2024. “Mass Reproducibility and Replicability: A New Hope.” SSRN Electronic Journal. https://doi.org/10.2139/ssrn.4790780.
Camerer, Colin F., Anna Dreber, Eskil Forsell, Teck-Hua Ho, Jürgen Huber, Magnus Johannesson, Michael Kirchler, et al. 2016. “Evaluating Replicability of Laboratory Experiments in Economics.” Science 351 (6280): 1433–36. https://doi.org/10.1126/science.aaf0918.
Camerer, Colin F., Anna Dreber, Felix Holzmeister, Teck-Hua Ho, Jürgen Huber, Magnus Johannesson, Michael Kirchler, et al. 2018. “Evaluating the Replicability of Social Science Experiments in Nature and Science Between 2010 and 2015.” Nature Human Behaviour 2 (9): 637–44. https://doi.org/10.1038/s41562-018-0399-z.
Errington, Timothy M, Maya Mathur, Courtney K Soderberg, Alexandria Denis, Nicole Perfito, Elizabeth Iorns, and Brian A Nosek. 2021. “Investigating the Replicability of Preclinical Cancer Biology.” eLife 10 (December): e71601. https://doi.org/10.7554/eLife.71601.
Klein, Richard A., Kate A. Ratliff, Michelangelo Vianello, Reginald B. Adams, Štěpán Bahník, Michael J. Bernstein, Konrad Bocian, et al. 2014. “Investigating Variation in Replicability.” Social Psychology 45 (3): 142–52. https://doi.org/10.1027/1864-9335/a000178.
Klein, Richard A., Michelangelo Vianello, Fred Hasselman, Byron G. Adams, Reginald B. Adams, Sinan Alper, Mark Aveyard, et al. 2018. “Many Labs 2: Investigating Variation in Replicability Across Samples and Settings.” Advances in Methods and Practices in Psychological Science 1 (4): 443–90. https://doi.org/10.1177/2515245918810225.
Nosek, Brian A., and Timothy M. Errington. 2020. “What Is Replication?” PLOS Biology 18 (3): e3000691. https://doi.org/10.1371/journal.pbio.3000691.
Open Science Collaboration. 2015. “Estimating the Reproducibility of Psychological Science.” Science 349 (6251): aac4716. https://doi.org/10.1126/science.aac4716.
Prinz, Florian, Thomas Schlange, and Khusru Asadullah. 2011. “Believe It or Not: How Much Can We Rely on Published Data on Potential Drug Targets?” Nature Reviews Drug Discovery 10 (9): 712. https://doi.org/10.1038/nrd3439-c1.
The Turing Way Community. 2021. “The Turing Way: A Handbook for Reproducible, Ethical and Collaborative Research.” Zenodo. https://doi.org/10.5281/zenodo.5671094.
Voelkl, Bernhard, Rachel Heyard, Daniele Fanelli, Kimberley Wever, Leonhard Held, Zacharias Maniadis, Sarah McCann, Stephanie Zellers, and Hanno Würbel. 2025. “A General Framework for Diagnosing and Addressing Reproducibility Problems,” April. https://osf.io/nmwr3.