Program evaluations are a critical part of any development program. As evaluators, we structure our research questions and methodologies to uncover the why and how of a program’s impact. These questions—and the insights we derive from them—enable our clients and partners to shape their work in ways that lead to larger improvements and ultimately greater equity.
But programs (and evaluations) don’t always go as planned. Just as our clients and partners pivot their programs when facing social or political changes, we approach evaluations with an agile mindset that enables us to find meaningful evidence in even the most complex situations.
Evaluations in a pandemic: Accounting for COVID-19
Few situations have been more complex than the COVID-19 pandemic. We still feel the ramifications of this global crisis today—not least in the global education sector. This poses an important question: how do we make the most of research that was already underway when the COVID-19 pandemic hit?
The Millennium Challenge Corporation’s (MCC’s) Éxito Escolar program in Guatemala introduced new training for teachers, strengthening their skills and teaching methods to improve student engagement and learning outcomes in secondary education. As the project’s teacher training entered its third and final year, COVID-19 caused school closures and forced the teacher training program online. Éxito Escolar pivoted to continue its work during the pandemic, changing its targets and strategies to account for the challenges the pandemic posed—and requiring us to rethink our methodologies for collecting and analyzing data.
For example, our evaluation used a randomized controlled trial based on clusters of schools. The randomized design would enable us to compare outcomes for students and teachers at schools that participated in the program with those at schools that did not, allowing us to attribute any differences to the Éxito Escolar intervention. However, our initial plan to collect data at the end of the training program was delayed almost two years by the pandemic.
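For readers less familiar with cluster-randomized designs, the minimal sketch below illustrates the underlying idea in Python: whole schools, rather than individual teachers or students, are randomly assigned to treatment or control, and outcomes are then compared between the two groups at the cluster level. The school identifiers, the random seed, and the simple difference-in-means comparison are illustrative assumptions for this sketch, not the evaluation’s actual procedure.

```python
import random
import statistics

# Hypothetical school identifiers standing in for the study's clusters.
school_ids = [f"school_{i:03d}" for i in range(1, 101)]


def assign_clusters(schools, seed=2019):
    """Randomly assign whole schools (clusters) to treatment or control."""
    rng = random.Random(seed)
    shuffled = list(schools)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return set(shuffled[:midpoint]), set(shuffled[midpoint:])


def difference_in_cluster_means(outcomes, treatment_schools):
    """Compare average school-level outcomes (e.g., mean test scores)
    between treatment and control clusters."""
    treated = [y for school, y in outcomes.items() if school in treatment_schools]
    comparison = [y for school, y in outcomes.items() if school not in treatment_schools]
    return statistics.mean(treated) - statistics.mean(comparison)


if __name__ == "__main__":
    treatment, control = assign_clusters(school_ids)
    print(f"{len(treatment)} treatment schools, {len(control)} control schools")
```

Randomizing whole schools rather than individuals is typically done to limit spillovers, since trained and untrained teachers in the same school would otherwise influence the same students.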
Moving the data collection back by nearly two years had three major implications for the evaluation:
- During these years, students spent little time in the classroom with their teachers because of pandemic-related school closures, limiting students’ potential exposure to their teachers’ newly acquired skills.
- Some teachers who participated in the training left their schools, decreasing trained teachers’ potential impacts in the schools in our study.
- We had no baseline data on the students from whom we collected endline data.
Ultimately, the quantitative data did not show a significant causal impact on student learning. However, we also could not rule out the possibility that the training improved teachers’ skills enough to boost student learning. Given these limitations on our quantitative data and the conclusions we could draw from it, our implementation study and the qualitative data from stakeholder interviews, focus groups, and other sources became more important in our efforts to understand the program’s contributions to teachers’ skills and student learning.
Most teachers completed the teacher training program, despite its demanding schedule and COVID-related challenges, because they valued the skills they gained. Teachers reported putting the practical skills they learned into practice and felt strongly that, right up until schools closed in 2020, their students were more engaged and learning more as a result.
We also captured important results that related specifically to the pandemic disruptions:
- Student dropout rates increased during this period, and students who remained enrolled experienced learning loss, which impaired our ability to identify or attribute positive impacts to the Éxito Escolar interventions. The COVID-19 disruptions occurred during the third year of the project, when many stakeholders anticipated seeing initial results, so quantitative data alone would not give us a complete picture.
- The training offered to teachers was initially provided both in person and online. When schools closed in 2020, the program quickly shifted to fully online delivery, which may have reduced dropout from the training program. In fact, more than half of the participating teachers noted that the shift to online training helped them balance the demands of the training against their other duties (including instructional time).
- However, school closures also meant teachers did not have as many opportunities to practice new teaching methods or skills, which could affect long-term retention of these skills.
Although COVID-19 caused disruptions on an unprecedented scale, programs adapt and pivot for many reasons. These pivots often result in data collection challenges but can also uncover important findings.
The importance of champions
Another strand of MCC’s work in Guatemala focused on strengthening the institutional and operational components of secondary education. MCC supported the development of information systems that could inform planning and budgeting, teacher recruitment, and other important policy reforms. However, toward the end of implementation, a change in administration resulted in new partners within the government.
Administration changes happen frequently in most contexts, and they can be a challenge when the political will built with a previous administration does not carry over. The new administration might be just as committed to the reform process as the previous government, but developing new relationships and trust takes time and can slow implementation. Such a transition also requires identifying new champions to move the work forward.
This experience shows how unanticipated events can throw an evaluation off course … and how to get back on track
Conducting evaluations in complex, dynamic situations requires high levels of creativity and rigor, but if done well, these evaluations can provide deep insights that can inform global progress for decades to come.
In the case of our evaluation in Guatemala, using a mixed-methods approach contributed to the evaluation’s resilience. When the pandemic-related school closures affected our timeline and limited what we could learn from our randomized evaluation, our qualitative data and analysis of our survey data provided rich information on teachers’ experiences with the training program and the ways in which it influenced their teaching. For example, teachers reported that the new teaching techniques they learned supported more active student participation in the classroom. Teachers also valued opportunities to share and learn from fellow teachers through the program’s learning communities.
Changes within government are more predictable than pandemics; many countries, including Guatemala, hold elections every four years. However, it can be hard to know exactly how these changes will affect project implementation and evaluations. Developing relationships with technical staff within government departments and documenting planned data collection can help future-proof evaluations during these periods of change.
Regardless of whether we’re facing a pandemic or a cyclical administration change, evaluators benefit from staying flexible and responsive as the context changes, building strong relationships with a wide range of stakeholders, and leaning on the available evidence. This is exactly what we did in Guatemala, and we continue to use these strategies to improve our work—even when everything goes right.