Can Research Design Explain Variation in Head Start Research Results? A Meta-Analysis of Cognitive and Achievement Outcomes

Hilary M. Shager, Holly S. Schindler, Katherine A. Magnuson, Greg J. Duncan, Hirokazu Yoshikawa, Cassandra M. D. Hart

Research output: Contribution to journal › Article

Abstract

This study explores the extent to which differences in research design explain variation in Head Start program impacts. We employ meta-analytic techniques to predict effect sizes for cognitive and achievement outcomes as a function of the type and rigor of research design, quality and type of outcome measure, activity level of control group, and attrition. Across program evaluations, the average program-level effect size was 0.27 standard deviations. About 41% of the variation in estimates across evaluations can be explained by research design features, including the extent to which the control group experienced other forms of early care or education, and 11% of the variation within programs can be explained by the quality and type of the outcome measures.

Original language: English (US)
Pages (from-to): 76-95
Number of pages: 20
Journal: Educational Evaluation and Policy Analysis
Volume: 35
Issue number: 1
DOI: 10.3102/0162373712462453
State: Published - Mar 2013

Keywords

  • Head Start
  • meta-analysis
  • program evaluation

ASJC Scopus subject areas

  • Education

Cite this

Can Research Design Explain Variation in Head Start Research Results? A Meta-Analysis of Cognitive and Achievement Outcomes. / Shager, Hilary M.; Schindler, Holly S.; Magnuson, Katherine A.; Duncan, Greg J.; Yoshikawa, Hirokazu; Hart, Cassandra M. D.

In: Educational Evaluation and Policy Analysis, Vol. 35, No. 1, 03.2013, p. 76-95.

@article{d97a5c34d0f24399bddf9aaefab753f7,
title = "Can Research Design Explain Variation in Head Start Research Results? A Meta-Analysis of Cognitive and Achievement Outcomes",
abstract = "This study explores the extent to which differences in research design explain variation in Head Start program impacts. We employ meta-analytic techniques to predict effect sizes for cognitive and achievement outcomes as a function of the type and rigor of research design, quality and type of outcome measure, activity level of control group, and attrition. Across program evaluations, the average program-level effect size was 0.27 standard deviations. About 41{\%} of the variation in estimates across evaluations can be explained by research design features, including the extent to which the control group experienced other forms of early care or education, and 11{\%} of the variation within programs can be explained by the quality and type of the outcome measures.",
keywords = "Head Start, meta-analysis, program evaluation",
author = "Shager, {Hilary M.} and Schindler, {Holly S.} and Magnuson, {Katherine A.} and Duncan, {Greg J.} and Hirokazu Yoshikawa and Hart, {Cassandra M D}",
year = "2013",
month = "3",
doi = "10.3102/0162373712462453",
language = "English (US)",
volume = "35",
pages = "76--95",
journal = "Educational Evaluation and Policy Analysis",
issn = "0162-3737",
publisher = "SAGE Publications Inc.",
number = "1",

}
