Shortly after Chris Boyd was announced as the new managing director of EDI Global, I had the opportunity to talk with him about his career, his passion for international development, and his vision for the future of monitoring, evaluation, and learning (MEL) practices. The following transcript of our conversation has been edited for clarity and conciseness.
How did you get involved in international development?
When I graduated from university I wasn’t sure what I wanted to do. After a few years of working and saving money, I went backpacking through Southeast Asia, Australia, and the South Pacific. I remember being in Fiji, where some people didn’t have running water; they were getting water delivered by truck. People in the streets were using any container they could find to hold it—pots, pans, buckets. I had what some might describe as an epiphany at that point and decided that I wanted a career where I could contribute to improving people’s lives. When I came back to Scotland, I got involved in economic development by working for a monitoring, evaluation, and research consultancy that did most of its work in the UK. And that’s how I got my start in the field.
At the same time, I also wanted to work internationally and make a difference in developing countries, so I was always on the lookout for opportunities which would fulfil this ambition. Eventually I was offered a spot on a project in Afghanistan with a firm that specialized in delivering projects in fragile and conflict-affected states. Even though the opportunity frightened me at first, I felt that I had to take it if I was to get a start in international development. So I decided to take a risk and see where it led. That first overseas assignment taught me a lesson about how our perceptions of conflict-affected countries are often wrong and based only on the limited information we get through the news. In Afghanistan, I found most people were just like me, trying to get on with their lives and make a better future for themselves and their families, though under more difficult circumstances.
What drove you to the monitoring and evaluation side of development?
I got involved in monitoring and evaluation work through a combination of chance and interest. When I started as a consultant in Glasgow, my company managed a lot of monitoring and evaluation projects, mainly for the UK regional development agencies. I started working with colleagues who were experienced in the field and was lucky to find great mentors in the company who helped me develop. As I gained experience, I found I really enjoyed the work and could see myself making a career in the field. But it was always my goal to take what I learned and apply it internationally.
When I first went to Afghanistan, I remember saying to my wife, “You know, if I only last a week out there, I've still achieved my dream because I got to work on an international project and to travel to do it.” Looking back, I was never going to be satisfied with that, but at the time I was like “Great, I've done it!”
Why are monitoring, evaluation, and learning practices important in international development?
Strong MEL practices are absolutely fundamental to effective international development. I’ve encountered far too many projects designed on the basis of weak evidence, where not enough was known about the background circumstances, the political economy, and the market dynamics to craft interventions with a decent chance of achieving their goals. Doing thorough research at the design stage is essential, including drawing on learning and evidence from similar projects with similar goals.
Secondly, once we start implementing a project, it’s really hard to see what’s ahead. In complex projects, it’s often unclear where programming will go next, because the situation is dynamic and uncertain. This means implementers need to respond to opportunities and circumstances in near real time. In my experience, the only way to do this effectively is through strong feedback loops that bring good information in from the field. Data informs strategic decisions about the direction of the program—it’s evidence-based management. The most successful programs have systems in place to obtain measurement data. Without it, you have no clear idea of how the intervention is performing and no solid basis on which to make decisions. There are still too many projects that don’t place enough of a premium on data-driven decision making, and as a result they often stumble around in the dark without the information they need to maximize their potential for success. The more we can embed the practice of using data for strategic decision making, the more value programs can deliver for funders; most importantly, the outcomes we deliver for the beneficiaries of those interventions will be improved.
Getting access to high-quality data is critical, but how an organization structures itself to use that data, to reflect on it and to make strategic decisions, is equally important. And to be honest, that’s where we’re still just getting started in international development—right now, the chief metrics for many implementing organizations are still related to fundraising and spend rates.
How has the role of monitoring and evaluation changed over the course of your career?
Monitoring and evaluation has definitely gained in importance. The big donors are spending more on it now than ever, so a whole profession and industry has grown up around MEL. The techniques we use have evolved; the software used to manage large data sets has evolved.
But if you look at the big projects, where millions of dollars might be spent on implementing an agriculture, education, or health program, there’s still some way to go. The MEL expertise is out there, but too often monitoring and evaluation is seen as separate and distinct from implementation. So you end up with a situation where the implementer and the MEL provider are hired separately. Sometimes this works, but often the interplay between the two is not as good as it should be, leading to problems in using data to manage the program and learn lessons.
Mathematica and EDI Global both pride themselves on having a combination of technical experience, field experience, and knowledge of policies and programs. How do you think that combination of experience positions an organization for the future of evidence-based development?
If we want to have the kind of impact at scale that makes international development work so valuable, we need more than technical experience and an understanding of theory. Producing peer-reviewed reports will only get you so far. You need to also understand the local context, the political economy, and how to get things done on the ground. This understanding is what makes the EDI Global and Mathematica partnership so powerful and positions us so well in the research field.
Mathematica brings high-level skills and expertise and a 50-year record of delivering high-quality research for clients around the world. EDI Global is an East African business with its roots in Tanzania and an expanding footprint in Uganda and Kenya that has built a solid reputation for high-quality data collection across the region. These complementary strengths offer a strong proposition for our current and potential partners in East Africa.
What people may not know about EDI Global is that most of our employees are East African. Developing their skills, their capacity, and their expertise is the right thing to do both for the development of our region and for the growth of our company. That local and regional expertise is what many of our clients are looking for. There’s still a role for international expertise, but it must be coupled with a strong local presence and local knowledge.
What is the value of your continuing to work in Nairobi and of having a presence in East Africa?
First, being in Nairobi is essential for really understanding what’s happening on the ground in East Africa. It’s very hard to have a clear picture if you’re not in the region. There is broad international donor and agency representation in Nairobi and many of our clients are here. So having a footprint is important in terms of being close to our partners and the organizations we work for.
Second, Nairobi is a great place from which to serve our business in East Africa and the rest of the continent. We're right on the equator, and I can get pretty much anywhere in Africa quite easily, so I can be on the ground quickly to meet with partners or oversee projects when required.
Most importantly though—when I’m in the same region and time zone as our local teams in Tanzania, Uganda, and Kenya, it sends a strong signal that we are as committed to Africa as they are. Being in Nairobi also gives me better access to the available talent in our region—talent that is essential to growing a successful East African business.
On a personal note, my family and I have made Nairobi our home for the past five years. I have a stake in ensuring that the region continues to grow and prosper because this is where we have chosen to be.
What does the future of evidence-based development look like to you and how do organizations like EDI and Mathematica fit into that future?
We’re already seeing the influence of technology, and that will only grow. Advances in satellite technology, the proliferation of smartphones across the continent, and the use of social media platforms and SMS are changing the way we collect data. Over the next decade, the key will be finding innovative ways to use these technologies to get better, faster information that can be used to improve people’s lives. That doesn’t mean we won’t continue to send people to the field to collect data, but the tools we use to do so will change. For example, in sampling, we already see the benefits of drone and satellite technology in helping us decide whom to focus on when we get to a survey location.
Obviously, there’s a lot of talk about data science and big data as well. We always need to be aware of how we can apply data science to the data sets we collect and that are available to us. I’m not a data scientist, but I understand how the field can help us gain insights that were previously unattainable or very time consuming to unearth. When we have partners that are spending hundreds of millions of dollars of taxpayers’ money on development projects, anything we can do to provide better data and evidence helps us more efficiently address the world’s development challenges.
Everything starts with quality data. The problem is that in much of the developing world, there’s a lot of low-quality data and decisions are often made based on information that has been poorly collected, is biased, or has a political slant. Mathematica and EDI Global are at the forefront of this quest for high-quality, truthful data, and we want to stay there.
EDI is celebrating its twentieth anniversary in 2022, so there’s a lot to look back on and be proud of. But looking out another decade, what does the future of MEL look like?
Plain and simple—I’d like to see chiefs of party, executive directors, CEOs, and other leaders of large implementation organizations understand that the key metric they are measured by is the development results they achieve. Those results should be the first thing they think about when implementing a project. If MEL and research can support this change in mindset by providing high-quality data and evidence, then the field will have come of age. That will be a hard bar to clear, but having more people in charge of organizations who appreciate the value of high-quality monitoring, evaluation, and research, and who understand that the primary reason we do this work is to achieve outcomes that change people’s lives, will make a big difference. If MEL can bring improved data to the table, and there’s greater commitment from international development leaders and practitioners to use evidence in decision making, the field will be in a better place.