Putting Rigorous Evidence Within Reach: Lessons Learned from the New Heights Evaluation
This article uses an evaluation of New Heights, a school-based program for pregnant and parenting teens in the District of Columbia Public Schools, to illustrate how maternal and child health programs can obtain rigorous evaluations at reasonable cost using extant administrative data. The key purpose of the article is to draw out lessons learned about planning and conducting this type of evaluation, including the important role of partnerships between program staff and evaluators.
This article summarizes the evaluation's research design, data sources, and lessons learned about the ingredients of a successful study of this kind. The evaluation employed a difference-in-differences design, estimating program impacts from administrative data merged across agencies.
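To make the design concrete, the sketch below illustrates how a difference-in-differences impact estimate of this general form can be computed with a simple regression. It is a minimal, hypothetical example: the variable names, data values, and outcome are assumptions for illustration only and are not drawn from the New Heights evaluation or its administrative data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: an education outcome (e.g., credits earned) observed
# for program participants and a comparison group, before and after the
# program period. Values are invented for illustration.
df = pd.DataFrame({
    "outcome": [10, 12, 9, 11, 13, 18, 10, 12],
    "treated": [1, 1, 0, 0, 1, 1, 0, 0],   # 1 = program participant
    "post":    [0, 0, 0, 0, 1, 1, 1, 1],   # 1 = after program exposure
})

# In a difference-in-differences regression, the coefficient on the
# treated:post interaction is the estimated program impact.
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.summary())
```

In practice, such a model would be estimated on merged administrative records with appropriate covariates and standard-error adjustments; this sketch only shows the core interaction-term logic of the design.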
Several features of New Heights and its context facilitated the evaluation. First, New Heights leaders could clearly describe the program's components and how the program was expected to improve specific student education outcomes. These outcomes were easy to measure for both program and comparison groups using administrative data that agencies were willing to provide. Second, buy-in from program staff eased study approval and data agreements and created unanticipated opportunities to learn about program implementation. Finally, time spent by evaluators and program staff discussing the program's components, context, and data produced greater mutual understanding and a more useful evaluation.
The New Heights evaluation is a concrete example of how a small program with a modest evaluation budget can obtain evidence of impact. Collaborative relationships between researchers and program staff can enable these informative studies to flourish.