Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis

Patrick Shrout, Joseph L. Rodgers

Research output: Contribution to journal › Review article

Abstract

Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.
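
As a rough illustration of the power-analysis reasoning summarized above (a sketch added here, not material from the article), the following Python snippet uses statsmodels to show how a single underpowered replication attempt can miss a real effect, and why basing replication efforts on multiple studies matters. The effect size (d = 0.3), group size (n = 50 per group), and target power (0.80) are hypothetical values chosen only for illustration.

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Hypothetical scenario: a true effect of d = 0.3 studied with n = 50 per group.
power = analysis.solve_power(effect_size=0.3, nobs1=50, alpha=0.05,
                             ratio=1.0, alternative='two-sided')
print(f"Power for d = 0.3, n = 50 per group: {power:.2f}")  # roughly 0.32

# Chance that at least one of k independent replication attempts reaches p < .05,
# assuming the effect is real and the attempts are independent.
for k in (1, 3, 5):
    print(f"P(at least one significant result in {k} attempts): {1 - (1 - power) ** k:.2f}")

# Per-group sample size needed to reach 80% power for the same effect size.
n_needed = analysis.solve_power(effect_size=0.3, power=0.8, alpha=0.05,
                                ratio=1.0, alternative='two-sided')
print(f"n per group for 80% power: {n_needed:.0f}")  # about 175

Under these assumed numbers, a single replication attempt misses a true effect about two times out of three, which is one reason the authors argue that replication verdicts should rest on multiple studies rather than a single attempt.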

Original language: English (US)
Pages (from-to): 487-510
Number of pages: 24
Journal: Annual Review of Psychology
Volume: 69
DOIs: 10.1146/annurev-psych-122216-011845
State: Published - Jan 4 2018

Keywords

  • Methodology
  • Replication
  • Statistics

ASJC Scopus subject areas

  • Psychology(all)

Cite this

Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis. / Shrout, Patrick; Rodgers, Joseph L.

In: Annual Review of Psychology, Vol. 69, 04.01.2018, p. 487-510.


@article{9ace82952b3b402fa250c75d254f6806,
title = "Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis",
abstract = "Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.",
keywords = "Methodology, Replication, Statistics",
author = "Shrout, {Patrick} and Rodgers, {Joseph L.}",
year = "2018",
month = "1",
day = "4",
doi = "10.1146/annurev-psych-122216-011845",
language = "English (US)",
volume = "69",
pages = "487--510",
journal = "Annual Review of Psychology",
issn = "0066-4308",
publisher = "Annual Reviews Inc.",

}

TY - JOUR

T1 - Psychology, Science, and Knowledge Construction

T2 - Broadening Perspectives from the Replication Crisis

AU - Shrout, Patrick

AU - Rodgers, Joseph L.

PY - 2018/1/4

Y1 - 2018/1/4

N2 - Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.

AB - Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.

KW - Methodology

KW - Replication

KW - Statistics

UR - http://www.scopus.com/inward/record.url?scp=85040343391&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85040343391&partnerID=8YFLogxK

U2 - 10.1146/annurev-psych-122216-011845

DO - 10.1146/annurev-psych-122216-011845

M3 - Review article

C2 - 29300688

AN - SCOPUS:85040343391

VL - 69

SP - 487

EP - 510

JO - Annual Review of Psychology

JF - Annual Review of Psychology

SN - 0066-4308

ER -