## Estimating the Reproducibility of Psychological Science

**Open Science Collaboration**

**Abstract:** Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects (Mr = .197, SD = .257) were half the magnitude of original effects (Mr = .403, SD = .188), representing a substantial decline. Ninety-seven percent of original studies had significant results (p < .05). Thirty-six percent of replications had significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and, if no bias in original results is assumed, combining original and replication results left 68% with significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

**Citation:** Open Science Collaboration. (2015). [Estimating the reproducibility of psychological science][1]. *Science, 349*(6251), aac4716. doi:10.1126/science.aac4716

### Contents

[Summary Report][2]: Read the [*Science* article][3] and supplementary material summarizing the results of the Reproducibility Project: Psychology. Or, read the [Green OA version with supplementary information][4] in the same file.

[Supplement only][5]: Supplementary materials to "Estimating the Reproducibility of Psychological Science." Includes additional graphs and details on analyses.

[Replicated Studies][6]: Explore the preregistrations, materials, data, and result reports of the individual replication projects.

[Guide to Analyses][7]: Reproduce the analyses of the individual projects and the aggregate results.

[RPP Process][8]: Learn more about the design, management, and operation of this large-scale crowdsourced project.

[Presentations][10]: Find articles, slides, notes, and videos of presentations of the Reproducibility Project: Psychology and related efforts.

### Comments

[Technical Comment by Gilbert et al. (2016)][20]: Read the technical comment written by Gilbert, King, Pettigrew, and Wilson on the Reproducibility Project: Psychology.

[Response to Gilbert et al. (2016)][21]: Read the response to Gilbert et al.'s technical comment, written by members of the Open Science Collaboration.

[Replicated Studies mentioned in Gilbert et al. (2016)][22]: Explore the final reports and data for individual replications mentioned in Gilbert et al.'s technical comment.

[All Comments][23]: Read additional comments on the publication and responses made by members of the Open Science Collaboration.

### Additional Resources

[Center for Open Science][11]: Learn more about the organization that facilitated the project and its initiatives to increase the transparency and reproducibility of research.

[Open Science Framework][12]: Learn more and get started using the free, open-source Open Science Framework for your own project management, archiving, manuscript sharing, and research registration.

[TOP Guidelines][13]: The Transparency and Openness Promotion Guidelines are a collective effort to improve transparency and reproducibility across disciplines.
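### Worked Example: The Confidence-Interval Criterion

One replication criterion in the abstract asks whether the original effect size falls within the 95% confidence interval of the replication effect size. The sketch below illustrates that check for correlation effect sizes using the standard Fisher z-transformation. It is a minimal illustration, not the project's actual analysis code (which is available through the [Guide to Analyses][7]); the function names and numbers are hypothetical.

```python
import math

def fisher_ci(r, n, z_crit=1.96):
    """95% confidence interval for a correlation r observed in a
    sample of size n, via the Fisher z-transformation."""
    z = math.atanh(r)                    # Fisher z of the observed correlation
    se = 1.0 / math.sqrt(n - 3)          # standard error in z-space
    lo, hi = z - z_crit * se, z + z_crit * se
    return math.tanh(lo), math.tanh(hi)  # back-transform bounds to r-space

def original_in_replication_ci(r_orig, r_rep, n_rep):
    """Does the replication's 95% CI capture the original effect size?"""
    lo, hi = fisher_ci(r_rep, n_rep)
    return lo <= r_orig <= hi

# Hypothetical values for illustration only (not from any RPP study):
# an original r of .40 against a replication r of .20 with n = 120.
print(original_in_replication_ci(r_orig=0.40, r_rep=0.20, n_rep=120))  # False
```

Note that by this criterion a smaller replication effect can still "capture" the original when the replication sample is small, since the interval is wider; this is one reason the paper reports several complementary replication metrics rather than relying on any single one.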
### About the Project

The Reproducibility Project: Psychology began in November 2011, finished primary data collection in December 2014, and published a summary of the results in August 2015. The project was coordinated by the [Center for Open Science][14]. Replication teams followed a research protocol and received logistical assistance as they collected materials, identified the key finding for replication, ran their experiment, conducted analyses, and reported their findings. As stated in an initial report from 2012, "The Reproducibility Project uses an open methodology to test the reproducibility of psychological science. It also models procedures designed to simplify and improve reproducibility" ([Open Science Collaboration, 2012][15]). To that end, all project materials, data, and findings are posted on the [Open Science Framework][16], a free service of the [Center for Open Science][17]. Moreover, the project models reproducibility by making it easy to [reproduce the analyses][18] of each individual project and the results of the aggregate report.

As the first in-depth exploration of its kind, the project results provide insight into reproducibility and its correlates. With a large, open dataset, many additional research questions can be investigated.

The project was designed to be a collaborative endeavor. Ultimately, over 270 contributors earned authorship on the summary report, and 86 others provided volunteer support. Replication teams designed, ran, and reported their replication studies. Brian Nosek, Johanna Cohoon, and Mallory Kidwell provided project coordination. Marcel van Assen, Chris Hartgerink, and Robbie van Aert led the analysis of results; Fred Hasselman generated the figures; and Sacha Epskamp led the analysis audit. Scores of additional volunteers assisted with coding of articles, analyses, and administrative tasks.

Since its inception, similar initiatives have begun in other scientific domains. The Center for Open Science coordinates one such effort, the [Reproducibility Project: Cancer Biology][19].

Questions about the project can be directed to rpp@cos.io.

[1]: http://www.sciencemag.org/cgi/content/full/349/6251/aac4716?ijkey=1xgFoCnpLswpk&keytype=ref&siteid=sci
[2]: http://www.sciencemag.org/cgi/content/full/349/6251/aac4716?ijkey=1xgFoCnpLswpk&keytype=ref&siteid=sci
[3]: http://www.sciencemag.org/cgi/content/full/349/6251/aac4716?ijkey=1xgFoCnpLswpk&keytype=ref&siteid=sci
[4]: https://osf.io/phtye/
[5]: https://osf.io/k9rnd/
[6]: https://osf.io/ezcuj/wiki/Replicated%20Studies/
[7]: https://osf.io/ytpuq/wiki/home/
[8]: https://osf.io/mczxi/wiki/home/
[10]: https://osf.io/qjab5/wiki/home/
[11]: http://cos.io/
[12]: https://osf.io/
[13]: http://cos.io/top
[14]: http://cos.io/
[15]: http://pps.sagepub.com/content/7/6/657.abstract
[16]: https://osf.io/
[17]: http://cos.io/
[18]: https://osf.io/ytpuq/wiki/home/
[19]: https://osf.io/e81xl/wiki/home/
[20]: http://science.sciencemag.org/content/351/6277/1037.2
[21]: http://science.sciencemag.org/content/351/6277/1037.3.full
[22]: https://osf.io/g28rt/wiki/Replicated%20Studies%20Mentioned%20in%20Gilbert%20et%20al.%20%282016%29/
[23]: https://osf.io/g28rt/