A ‘Must Read’ Paper on the Reproducibility of the Results of Science, Published in Science

The authors of this paper [1] belong to an organisation called the 'Open Science Collaboration'. They examined the reproducibility of studies in the field of psychology. They did this by the laborious process of replicating each of the studies.

One hundred replications were performed. You can see why this paper impressed even the jaded palate of the CLAHRC WM Director. In each case the protocol for the replication study was reviewed by, among others, the authors of the original study. The replication protocol was made publicly available. The sampling frame for the original studies was based on three high-impact psychology journals in 2008. All studies were audited and all statistical analyses were repeated independently.

There is no agreed standard for evaluating success in replication studies. The authors therefore used several methods – statistical significance, effect sizes, and subjective assessments by the replication teams. The different methods correlated with each other (r=0.22 to 0.96). Effect sizes were transformed to correlation coefficients to make them comparable across studies.

Hold onto your seats! The mean effect size of the replications was half the magnitude of the mean effect size of the original studies. The direction of effect was reversed in a staggering 25% of original/replication dyads. Ninety-seven percent of original studies had results with P<0.05, dropping to only 36% in the replication set. Slightly less than half of the original point estimates lay within the 95% CIs of the replication estimates. Lastly, only 39% of the effects were deemed to have been replicated on subjective assessment.
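The CI-coverage criterion mentioned above can be sketched in a few lines. This is a minimal illustration, not the paper's actual analysis code: it checks whether an original correlation falls inside the replication's 95% confidence interval using the standard Fisher z-transform, and the function names and sample values are hypothetical.

```python
import math

def fisher_z(r):
    # Fisher z-transform of a correlation coefficient
    return 0.5 * math.log((1 + r) / (1 - r))

def ci_contains_original(r_orig, r_rep, n_rep, z_crit=1.96):
    """Does the original correlation lie inside the replication's
    95% CI? (One of several possible replication criteria.)"""
    z_rep = fisher_z(r_rep)
    se = 1 / math.sqrt(n_rep - 3)  # standard error of Fisher z
    lower, upper = z_rep - z_crit * se, z_rep + z_crit * se
    return lower <= fisher_z(r_orig) <= upper
```

For example, an original r of 0.40 against a replication r of 0.20 would count as "replicated" by this criterion with a small replication sample (n=50) but not with a larger, more precise one (n=100) – illustrating how sample size shapes what passes as a successful replication.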

These are provocative findings. But it is lazy to conclude that scientific results are nonsense and all is lost. Rather, like stories of corruption in politics, the CLAHRC WM Director takes heart that the psychology community takes its work so seriously that it has conducted this amazing study. He wonders whether a similar study is possible in qualitative research and, if so, what it might find. Meanwhile, replication is pretty much routine in clinical research in the form of assessment of heterogeneity in meta-analyses.

— Richard Lilford, CLAHRC WM Director


  1. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015; 349(6251): aac4716.
