How AmeriCorps Uses Data to Support Communities Through National Service and Volunteerism

Aug 28, 2024

Thirty years ago, President Bill Clinton signed bipartisan legislation creating what is now known as AmeriCorps, a federal agency for national service and volunteerism. Since then, the agency estimates that more than 1.3 million AmeriCorps members and hundreds of thousands more AmeriCorps Seniors volunteers have provided billions of hours of service across each of the 50 states and U.S. territories.

AmeriCorps functions as a kind of Swiss Army knife for addressing challenges in our communities, with members helping to prepare today’s students for tomorrow’s jobs, lead conservation and climate change efforts, connect returning veterans to jobs, fight the opioid epidemic, support seniors to live independently, make college more accessible and affordable, and help Americans rebuild their lives following disasters. To understand the impacts of the agency’s wide range of investments, AmeriCorps relies on its Office of Research and Evaluation.

To commemorate the 30th anniversary of AmeriCorps, On the Evidence hosted a discussion on the role of data and research in helping AmeriCorps and its grantees deliver on their respective missions. The guests for this episode are Dr. Mary Hyde, Stephanie Garippa, Diana Gioia, and Scott Richman.

Hyde is the director of research and evaluation at AmeriCorps.

Garippa oversees the AmeriCorps program at Maggie’s Place, an Arizona-based nonprofit and AmeriCorps grantee serving mothers through housing, education, and direct support services.

Gioia is a former AmeriCorps member who works with Garippa at Maggie’s Place, where she is the data and evaluation coordinator.

Richman is a senior researcher at Mathematica who co-authored a report for AmeriCorps synthesizing five years’ worth of studies describing the state of the evidence in the areas where the agency seeks to make an impact. The report is part of AmeriCorps LEARNS, an ongoing project in which Mathematica helps AmeriCorps share the growing body of evidence on the impact of national service, along with resources for building that evidence.

On the episode, they discuss the important but complex task of measuring the impact of AmeriCorps, given that it seeks to not only spur higher levels of civic engagement and community service, but to simultaneously help its service members, partnering organizations, the communities in which both operate, and society as a whole. They talk about the kinds of evidence that AmeriCorps and grantees like Maggie’s Place collect, how that evidence is used to drive impact, and how the role of evidence in guiding AmeriCorps’ work has evolved over time.

Watch the full episode.

Transcript

[MARY HYDE]

 

I think that ultimately, this evidence is in service to creating better lives for people, and that's what my goal is, is to continue to do that with the hope that it is useful to those who most need it and most want to improve the conditions in their communities.

 

[J.B. WOGAN]

 

I’m J.B. Wogan from Mathematica and welcome back to On the Evidence.

Thirty years ago, President Bill Clinton signed bipartisan legislation creating what is now known as AmeriCorps, a federal agency for national service and volunteerism. Since then, the agency estimates that more than 1.3 million AmeriCorps members and hundreds of thousands more AmeriCorps Seniors volunteers have provided billions of hours of service across each of the 50 states and U.S. territories.

 

AmeriCorps functions as a kind of Swiss Army knife for addressing challenges in our communities, with members helping to …

 

prepare today’s students for tomorrow’s jobs,

 

lead conservation and climate change efforts,

 

connect returning veterans to jobs,

 

fight the opioid epidemic,

 

support seniors to live independently,

 

make college more accessible and affordable,

 

and help Americans rebuild their lives following disasters.

 

If you’re like me, you’ve probably heard of AmeriCorps before. Maybe you or someone you know was once an AmeriCorps member or AmeriCorps Seniors volunteer. What you might not know is that AmeriCorps has an Office of Research and Evaluation, which studies the impacts of the agency’s investments in national service and volunteerism.

 

On this episode of On the Evidence, we commemorate the 30th anniversary of AmeriCorps by discussing the role of data and research in helping both AmeriCorps and its grantees deliver on their respective missions. Our guests are Dr. Mary Hyde, Stephanie Garippa, Diana Gioia, and Scott Richman.

 

Mary is the director of research and evaluation at AmeriCorps.

 

Stephanie oversees the AmeriCorps program at Maggie’s Place, an Arizona-based nonprofit and AmeriCorps grantee serving mothers through housing, education, and direct support services.

 

Diana is a former AmeriCorps member who works with Stephanie and is the data and evaluation coordinator at Maggie’s Place.

 

Scott is a senior researcher at Mathematica who co-authored a report for AmeriCorps synthesizing five years’ worth of studies describing the state of the evidence where the agency seeks to make an impact.

 

On the episode, we discuss the important but complex task of measuring the impact of AmeriCorps, given that it seeks to not only spur higher levels of civic engagement and community service, but to simultaneously help its service members, partnering organizations, the communities in which both operate, and society as a whole. We talk about the kinds of evidence that AmeriCorps and grantees like Maggie’s Place collect, how that evidence is used to drive impact, and how the role of evidence in guiding AmeriCorps’ work has evolved over time.

 

If you’re new to the show, please consider subscribing. More information on how to subscribe on your podcasting app of choice is available at mathematica.org/ontheevidence.

 

I hope you enjoy the conversation.

[J.B. WOGAN]

 

All right. Well, Mary, we're going to start with you, and I'm hoping you can do some table setting for us and for the listeners. I suspect many of our listeners have heard of AmeriCorps before and maybe have heard of a specific volunteer project, but if you could just give us a lay of the land: What does AmeriCorps do? What is the mission of the agency?

 

[MARY HYDE]

 

Our mission is to improve lives, strengthen communities, and foster civic engagement through service and volunteering.

 

So, what does that mean? What does that look like? You mentioned that some folks might be familiar with our programs, but in general, if you want to participate in our programs, we have opportunities for people of all ages, and they can volunteer in one of our six results-based national service programs.

We have our AmeriCorps Seniors programs, which are the Foster Grandparent Program, the Senior Companion Program, and the RSVP program, or you can volunteer in our AmeriCorps State and National program, our VISTA program, or our National Civilian Community Corps. A key component of all of these programs is mobilizing volunteers.

So, if you're going to participate in any one of these programs, you're going to be involved in some way in organizing volunteers around different policy challenges or issues such as early literacy or numeracy, independent living or aging in place, disaster response and recovery, or access to viable career paths.

 

So those are just some examples of what we do and what our mission is and how we achieve it.

[J.B. WOGAN]

 

Okay. And I wanted to ask about how the mission has changed over time, or maybe not changed, but how different aspects of the mission have been emphasized by Congress and agency leadership since its inception back in 1993.

[MARY HYDE]

 

Sure, I think I could probably answer that best by talking a little bit about how the agency's evidence story has evolved, because I think that reflects the different emphases that have been placed on different aspects of our mission.

So what do I mean by that? I would say early on, we really focused on the actual participant or the volunteer in the program. We focused on volunteering behaviors across the general population. So it was kind of a big funnel, if you think about it: we talked about what this looks like in the general population at large, and what it means for the actual volunteers in our national service programs.

 

So we wanted to understand their experience. We wanted to understand their outcomes and we wanted to understand sort of the best practices around getting people to volunteer in their community around different issues.

 

That sort of began to morph into more of a community focus. So if you recall, I talked about improving lives, but also strengthening communities. We began to evolve from thinking a lot about volunteering behavior in general, or the actual members and volunteers themselves, and really wanted to start focusing on the community impact of our programs.

And that shift corresponded with the rise of program evaluation as a practice inside the federal government, as well as some other big policy trends within the federal government. And so that meant a shift to looking at different interventions, different program models. Not that we stopped paying attention to the volunteers and members themselves, but suddenly we were much more focused on specific programs like tutoring programs, college access and persistence programs, workforce development programs, peer recovery coaching, youth mental health: very specific issue areas and programs tied to addressing those issue areas.

I think that was probably the last 20 years in a nutshell, sort of this different focus. Today, I think you will find that we are really trying to have a balanced approach to that. So, we have an impact framework that we've developed that really looks at all of those dimensions of our mission and all of those areas of results that we're trying to achieve.

So, we're looking at the participants, we're looking at our partners, we're looking at communities and we're looking at society. So, I think that's been the evolution over time since the agency was founded in 1993.

[J.B. WOGAN]

 

Okay. So you've got participants, you've got the issue areas, and you've got the specific interventions, or programs. And in terms of measuring performance, you lead the part of the agency that has the task of actually measuring whether you're delivering on those different performance areas, right?

So I was hoping you could explain, Mary, what is the Office of Research and Evaluation and what is its role in the broader context of AmeriCorps’ work?

 

[MARY HYDE]

 

Yeah, I think that our role, as you said, is to take the lead in developing the evidence base for our programs and for our national service participants.

I think that we're responsible for working with our program colleagues, our business support offices, and particularly our chief data office to really think about the types of information that we need to make the case and to create insights into what might be working and what might need to be changed.

 

And what that means is we have to take a lot of different approaches to how we find evidence and how we build evidence. We rely on administrative data. We look at performance metrics so that we can assess impact in real time, if you will. As programs are going along, as things are being implemented, what types of outcomes are we seeing as this work unfolds in communities? That relies on administrative data and on performance measures that our partners submit to us.

 

We also use foundational research and program evaluation. Some of these questions require more of a social science approach. They require a comparative design, so that we're looking at this tutoring program relative to another tutoring program or the absence of a tutoring program. So we need to look at those types of outcomes in that way.

 

I think that we rely on survey research a lot. I spoke a little earlier about looking at these types of behaviors in the U.S. general population. We have a survey that we administer in partnership with the U.S. Census Bureau. We have an annual survey that we administer to our members. So we use a lot of different approaches and a lot of different strategies to generate what we think is the most relevant data and evidence for any particular question or any particular policy need.

 

[J.B. WOGAN]

 

I did want to ask one more follow-up question, Mary, which is: who do you conceive of as the audiences or the consumers of your office's work? Is it Congress? Is it members of the public? Is it many different audiences? Who are you producing research for?

 

[MARY HYDE]

 

I would say all of the above, right? I mean, we have a lot of stakeholder groups. We are taxpayer dollars being used in communities, so we certainly share this evidence and build this evidence for the purposes of demonstrating that these are good investments, that this is giving the public the quality services they deserve. Certainly Congress, which has a hand in our appropriations. We certainly think of our program partners as key audiences for this information. Our hope and our goal is that we use this data and evidence to make continuous improvements in our strategies, in our particular programs, and in the way in which we engage our members and volunteers.

 

So again, going back to our framework and the dimensions of our mission, our hope is that at the end of the day, people take whatever evidence we're generating out of our office and other offices and use it to make things better, to improve and strengthen them, or in some cases to pivot and stop doing something if it's shown to be ineffective.

 

[J.B. WOGAN]

 

Well, I heard you say program partners, and that's a perfect segue to talk to our friends here at Maggie’s Place. Stephanie, I was hoping you could explain what your organization Maggie’s Place is and how does AmeriCorps support your organization's work?

 

[STEPHANIE GARIPPA]

 

Absolutely. We are a nonprofit organization in Phoenix, Arizona, in the greater Phoenix area, focused on serving pregnant women who are experiencing homelessness.

 

We have five maternity homes, family-style homes across the valley, and we welcome women who are pregnant and, as I said, experiencing homelessness, and they can stay until their baby is a year old. We have live-in AmeriCorps members who live in our maternity homes and provide the front line of support for our moms and their babies.

 

Our AmeriCorps members span age ranges. They often are right out of college and looking to do a year of service. But we call our AmeriCorps members our secret sauce. They enable us to have the model that we have, which is a small ratio, typically about six to seven mothers in a home with three to four AmeriCorps members.

 

And it's that day-to-day support, specifically, that AmeriCorps members are able to provide that enables our mothers to navigate the challenges of pregnancy and childbirth and learning to care for their babies, as well as to achieve goals that will help set them up for self-sufficiency when they leave our homes.

 

[J.B. WOGAN]

 

Okay. I want to be named part of the secret sauce of a place where I work. That's pretty cool. And I did want to flag that I believe Maggie's Place is part of what's known as the Healthy Futures portfolio, or issue area, for AmeriCorps. Mary, do you want to explain what Healthy Futures is and how Maggie's Place fits into that context?

 

[MARY HYDE]

 

Yeah, sure. Our legislation basically mandates the agency to focus on six specific issue or policy areas. One of them is healthy outcomes for communities. There are five others, but this is one of our main focus areas. And so Maggie's Place is focused on moving community outcomes in the space of healthier lives for residents, particularly unhoused moms.

 

[J.B. WOGAN]

 

Okay, awesome. Well, Scott, you've been waiting patiently as we've been laying the groundwork here. I wanted to ask, what is the AmeriCorps LEARNS project, and what role is Mathematica playing in supporting the Office of Research and Evaluation at AmeriCorps?

 

[SCOTT RICHMAN]

 

So, as Mary mentioned earlier, AmeriCorps produces a tremendous amount of research, data, and evidence every year, and that body of work continues to grow. It also creates a lot of evaluation-building resources for organizations to use for their own programs.

 

And our role for ORE, the Office of Research and Evaluation, is to help communicate all that information and those resources to the various audiences that Mary also mentioned earlier. So we're talking about Congress, we're talking about program partners, we're talking about even internal AmeriCorps staff.

 

So, you know, with the amount of information, research, and knowledge that's being developed, we don't want that information just to sit on a shelf and collect dust. Our goal and our role is to help communicate that information out, but also to make sure we're communicating it in a way that it's being put to use and that it's drawing awareness to the research being done on civic engagement and national service, or just how to conduct an evaluation. These are great resources that AmeriCorps produces, and we want them to be in the hands of the folks who can use them, both to improve their evaluation-building activities and, most importantly, to make a difference in their communities.

 

So our goal is to help the AmeriCorps Office of Research and Evaluation put all that information out there and put it to good use, so that audiences are aware of the evidence that exists for national service, and also to help build their capacity to conduct evaluation. And we do that in a lot of different ways, which is really exciting and fun to do.

 

So we create, for example, a whole bunch of products, which can take the form of infographics, fact sheets, and newsletters. We create webinars, and we host a variety of other things. Really, any way we can figure out how to get that information out there, that's what we want to do to make that knowledge known to everyone who can put it to use.

 

And then another thing that we do is we help ORE manage its Evidence Exchange, which is an online repository of research studies on the americorps.gov site. So as new studies are conducted each year, we help ORE put that information on the Evidence Exchange so that the new knowledge is known and can also be put to use.

 

So that's a few of the things that we do. And there's no shortage of activities that are needed to help communicate all the good work being done.

 

[J.B. WOGAN]

 

Okay, that's very cool. I'm a big fan of those kinds of online clearinghouses of evidence. I know there are a few others that Mathematica has worked on in the past or is currently working on, but it seems like a great resource when somebody has a question and they want to see what the research says.

Has any research been done on something? It's nice that there are resources like that available to check online.

 

And I know we'll be talking a little bit later in this conversation about the State of the Evidence report that Mathematica did for AmeriCorps. So I'm looking forward to talking about that, too.

 

Mary, I want to turn back to you. Broadly speaking, I'd love to hear a little bit more about how the agency measures its impacts, particularly given that there are multiple tiers of things that you're trying to accomplish in terms of civic participation, programs, interventions, and participants.

 

So, maybe to start, has the process of measuring impacts changed over time? And if so, how?

 

[MARY HYDE]

 

Yeah, I can give you some concrete examples. Some of the ways that we measure impacts for those who serve, the members and volunteers in our programs: I think I mentioned we have a member exit survey that we administer to everyone completing their year of service, but we supplement that with more longitudinal survey research, where we look over time, at least a year post-service, to see where they are in terms of their educational pursuits or a particular career and whether or not they're still engaged in their community in any way.

 

So, for example, we just finished up a five- or six-year study with our National Civilian Community Corps, or NCCC, where we followed members for two years to look at their outcomes after their year of service. We're doing something similar with the Public Health AmeriCorps initiative, where we are following those who serve in a Public Health AmeriCorps program over time, again, to see where they end up in their careers. And we are really looking at how to do that a bit more intentionally, no matter what program someone is serving in, so that we're looking at their experience while in a program, immediately upon concluding their service, and then, ideally, a year out, to see where they are.

 

In terms of our interventions and the community organizations that are partnering with us, using members and volunteers to deliver services, we have program evaluation studies, and that usually comes in one of two ways.

 

In some cases, the organization itself is conducting an evaluation of its program and submitting those findings to us at headquarters, if you will. In other cases, we're sponsoring a set of evaluations, and I can give you an example. We're looking right now at what we call a bundled evaluation, which takes different organizations that are looking at similar types of outcomes using similar types of strategies. We bundle them together, we design an evaluation with them, we develop any instruments that may need to be developed, and then we work with folks to understand how they can best assess the outcomes they're trying to change. We've got that in our healthy outcomes space. We've got that in our Volunteer Generation Fund space. We're doing some of that work in our climate change space.

 

But the point is that, you know, in some cases we lead and sponsor a sort of national level evaluation where our partner organizations can participate. And it's sort of a multi-site evaluation. In other cases, depending on what program funds you're using, you may be required to conduct your own evaluation. So you're using either an internal evaluator or a third party to evaluate your intervention. You submit that information to us and we then try to compile it and look at it in a systematic way. I think Scott mentioned that.

 

And then, in terms of what we are doing around social capital in a community: there's this assumption that by mobilizing members and volunteers, and by having organizations work with the community to move certain outcomes, you're creating some level of social capital.

 

It's a hard thing to measure directly, but we would say that this biennial survey that we do with the Census Bureau is one way of looking at trends in these types of behaviors over time, where we assume that we are making contributions to that space, if you will. So those are different strategies that we use to try to look at all of these different levels of impact. And, to your earlier question, I would say in years past we would do one strategy more than another.

 

I would say at this point in time, it's pretty balanced. Our office has basically two big portfolios of work. One tends to focus on our members and volunteers and sort of this type of behavior in the U.S. general population. The other tends to focus on program evaluation research, as well as capacity building efforts.

 

I would add also that we partner with academics across the country where we have a research grant program, and they help us look at more local types of civic engagement, community engagement, and what that social connection, social capital might look like at a more local level versus that national level survey that we do.

 

So multiple strategies, multiple partners, all needed to answer all of the questions that are tied to our mission.

 

[J.B. WOGAN]

 

Has the level of expectation around how sophisticated the research or data needs to be changed over time, too? Like, was there a time when it was sort of process measures, and now you're trying to show causal impacts? Are there higher levels of expectations around the quality of data that you have to collect and use?

 

 

[MARY HYDE]

 

Yeah, absolutely. I would say that in the last decade or so, our agency, as well as other federal agencies, has really moved in the direction of assuming that quality performance data is important. You want that output data, you want that outcome data. Those are critical ways of assessing real-time progress and making sure you're on track.

 

But there is an increased expectation that you will have some causal evidence. So, for example, AmeriCorps invests almost half of its money in educational programs. We want to improve academic outcomes for young people across the country, K through 12 and some post-secondary. At some point, the expectation from Congress, from the public, from the White House has been, okay, well, what are your most effective programs? And by effective, we want you to use the most gold-standard, rigorous design available so that you can say, relative to other types of interventions, you're seeing these types of academic outcomes, and you can be pretty confident that it's because the programs you're funding are playing a role in that.

 

So, yes, I would say the bar has been set higher over the last decade or so. Our requirements of grantees are tied to levels of funding and the level of rigor that one expects. I think that at this moment in time, we are trying to balance that. So it's not an either/or, but all of it, and what is the best type of data, depending on the question and, I guess I wouldn't want to say risk, but the stakes of a decision, right?

 

Like if you're trying to tweak a program because you seem to be on track, but might need a little pivoting, that's one stake. If you're talking about the investment of, you know, half a billion dollars in a tutoring program, that's another stake. That's a different level of public dollars being invested.

So you want to have more confidence that what you're doing is the right strategy. So yeah, I think that the stakes have gotten higher, that the expectations have changed. But I would also say that at least at AmeriCorps, we're trying to balance and right-size and position everyone to be able to succeed and have some good data no matter where it is on that continuum to do the best decision-making they can.

 

[J.B. WOGAN]

 

Okay, Diana, I want to turn to you now. We just heard from Mary about the role of AmeriCorps' Office of Research and Evaluation. You have a corresponding role at Maggie's Place. So I'm curious, how do data and evaluation figure into your work at Maggie's Place? What kind of data do you collect? What are you evaluating, and how are you doing that? And to what extent do you benefit from AmeriCorps' work in making Maggie's Place more data-driven and evidence-based?

 

[DIANA GIOIA]

 

Yeah, that's a great question. I think Maggie's Place is definitely in the growth phase of data: learning from our data and evaluating the data we do have. Back in 2015, Maggie's Place joined HMIS, which is Maricopa County's Homeless Management Information System, and we've really tailored how we track our client data within that system. So we do our intake and we gather all sorts of information from our clients, such as demographic information.

 

We track the reasons why they're experiencing homelessness, how long they've been homeless, their job history, and also any history with substance use or domestic violence. We capture all of that information at the beginning, during our intake, so that the moms we serve can tailor the goals they set with their case manager to really utilize that year or year and a half that they live with us at Maggie's Place.

 

We are also involved, at a broader level, with an impact evaluation through Notre Dame studying the effectiveness of maternity homes. It's a five-year study that we are participating in with four other maternity homes, and we're tracking the moms who participate in our program versus the moms who don't.

 

So I'm really excited about this, just to see the outcomes and whether we can provide more of that concrete evidence that we're discussing to larger stakeholders on the effectiveness of maternity homes. Additionally, tying back to that growth period that we're in, we are working very closely with Mathematica on evaluating the systems that we have right now.

 

So how do we clean up all of those tracking spreadsheets to make them as efficient as we can, so that when we present to stakeholders and grant funders, we're able to capture what we're learning in easy grabs, if you will? And then Stephanie can talk a little bit about our connection to AmeriCorps.

 

[J.B. WOGAN]

 

Okay, yeah, that'd be great. Stephanie, if you want to talk a little bit about the role that AmeriCorps plays in supporting Maggie's Place.

 

[STEPHANIE GARIPPA]

 

Yes, for sure. So we also, within the last year, have participated in a life-cycle evaluation through AmeriCorps with several other programs to look at the effectiveness of our peer recovery coaching with our clients.

 

And so there are a lot of overlapping things that we're assessing and looking at, and that we have been privileged to be invited to be a part of through AmeriCorps. I think the AmeriCorps program itself really lends itself, and its grantees and subgrantees, to being evidence-informed and evidence-based.

 

The program is set up to guide programs like ours through learning what it means to be evidence-based and through building in different structures that help support that. We just started our second three-year grant cycle with AmeriCorps, and as part of that process, programs submit an evaluation proposal.

 

AmeriCorps helps us to think, from the beginning of a grant cycle, about how we're going to evaluate the effectiveness of the program throughout the course of that grant cycle, and it provides the tools to do so as well. One of the things about the life-cycle evaluation that was very beneficial for us was that several of us on staff participated in capacity-building sessions over the course of a year, where we were able to not only participate in the evaluation but also learn how to conduct one, learning the nuts and bolts of evaluation processes that we could implement on our own.

 

And so, subsequently, we did a small internal process study on our own last year to learn about the implementation of our initial three-year AmeriCorps cycle. What went according to plan? How did we pivot? How did we adapt, especially during the COVID pandemic? And that was a great learning process for us.

 

[J.B. WOGAN]

 

This is, I think, a question for both of you, Stephanie and Diana. You've both talked about ways in which you're gathering more data and participating in evaluations. Is there anything scary about that from an organizational standpoint? Any fear, when you enter into that kind of partnership or project, that you may get information you didn't want to receive? What if you find out that the outcomes aren't what you were hoping for? And what motivates you to still seek out that information even if there is some risk involved?

 

[DIANA GIOIA]

 

Yeah, I think you kind of touched on it, but definitely the communication to larger leadership is something that I continue to work on, because we're gathering facts, we're gathering evidence, and sometimes it is what we want to see, and sometimes it's not. And so it's thinking about, how do we grow? How do we address the challenges that we're getting feedback about? So definitely communication is something that I'm really focused on. And I think it can be scary.

 

[STEPHANIE GARIPPA]

 

And I would say one of our strengths as an organization is curiosity and having a learner's mindset.

 

We are not afraid to make mistakes, as long as we learn from them. We are committed to learning everything that we can through best practices and then trying it. And I think that is one of the key components to successfully incorporating data and evidence into a program like ours, which is very grassroots in some ways, in terms of our being in touch directly with those we're serving.

It's all about how we approach it. It's all about what we can learn. It's better to know than to keep doing something that may be ineffective and not know. So if we can have that mindset, it sets us up to learn and grow together, and to do better, which at the end of the day is what our clients and our communities deserve.

 

[J.B. WOGAN]

 

Have there been any examples that are top of mind where you have had to pivot or tweak something based on learning that it isn't quite working the way you had hoped it would?

 

[STEPHANIE GARIPPA]

 

Yes, I think what comes to mind is that we have been on a journey of becoming a trauma-informed organization for quite a few years now. Along those lines, we have tried a few different programmatic approaches, and we have tried adjusting some of the guidelines, some of our requirements of the moms that live in our homes, and the functioning of our homes.

 

Some of those adjustments were exactly what was needed for the program to work well. Others, we learned, didn't work for our model, even though they did work well for other organizations doing work with similar populations. And so it was through both formal and informal feedback and evaluation that we were able to learn and pivot, continue to build on the strengths of our programming and our model, and scrap something that we tried that was new but didn't work for us.

 

[DIANA GIOIA]

 

Yeah, there's a term I like to use: fail quickly. I think that applies to how we approach situations. And that's something Mathematica has helped us realize, too, throughout our evaluation with AmeriCorps: do a road test, see what feedback you get. And if you fail, it's okay, but it's better to fail early in the process.

 

[J.B. WOGAN]

 

Okay. Yeah, that's great. I love the idea of road testing, you know, small-scale, short-term, get information quickly and then adjust.

 

Scott, I want to turn back to you. We were talking a little bit about the Evidence Exchange earlier, which is a data source for a report produced by Mathematica on behalf of AmeriCorps last year.

 

Could you talk a little bit about what the State of the Evidence report is and, maybe share some of the takeaways there. What did last year's report reveal about AmeriCorps and its impacts?

 

[SCOTT RICHMAN]

 

Certainly. So with the State of the Evidence report, what we did was build off of a prior report that the agency did in 2017 and look at the large body of literature that has been developed since then. We looked at over 115 studies or so: studies by places like Maggie's Place that are doing their own evaluations, what ORE, the Office of Research and Evaluation, is producing, and what its research grantees are doing. So we pulled together this large body of research created since 2017 and really tried to identify key themes around the places where AmeriCorps is seeking to make an impact.

 

That's AmeriCorps participants, meaning the members and volunteers who are serving through AmeriCorps; the partnering organizations, meaning the grantees and sponsoring organizations that work with AmeriCorps and the funding they get; and the impacts on communities, right? As these services come in across the country to address local challenges, what is the impact of these programs, and of national service, on those local communities and the challenges they may be experiencing? And then another area was to help synthesize some of the information on national service and civic engagement more broadly.

 

So those are the four layers, you know: participants, partners, communities, and society. That's where AmeriCorps is seeking to make an impact. And we had a large number of studies that looked at all of that, so our goal was to identify some of the key themes in those areas.

 

So, for example, for participants, Mary mentioned earlier some of the survey data, the member exit survey data; we were trying to get an understanding, for example, of what cultural competencies AmeriCorps members possess when they finish their service term, and looking at the data there. For partners, how are AmeriCorps members and volunteers helping to build the capacity of those sponsoring or partner organizations to meet their project goals, or just increasing their capacity to sustain themselves in their communities?

 

In the case of community impacts, we found, based on the large body of literature, that there's a lot of evidence for impacts on education outcomes and environmental stewardship outcomes, in terms of the national service programs that are operating in and focusing on those areas. But across the other AmeriCorps focus areas as well, the State of the Evidence report highlighted how programs helped to generate a return on investment. So when you contribute funding to those programs, not only are they making an impact in their communities, but they're actually demonstrating value after the fact. The State of the Evidence report highlighted some key findings there.

 

It also highlighted scaling: when you find an effective solution in one place, how do you take that and bring it to other communities and hopefully replicate that success elsewhere? So we touched on some of the studies that AmeriCorps had already produced in that area, and how that all accumulates into the SCALER tool, which can help prepare organizations for their potential scaling aspirations.

 

And then, in the final bucket at the society level, the State of the Evidence report highlights data from the Civic Engagement and Volunteering Supplement to the Current Population Survey, just to get a sense of national rates of volunteering and what differences may exist among demographic groups.

 

But then, as Mary mentioned earlier, there are the Office of Research and Evaluation's research grantees, who are doing new and innovative work on understanding national service and civic engagement more broadly.

So what are some new cutting-edge methods and findings from those teams? One example is really digging into participatory research methods, where you're using research but you're engaging and incorporating community members both into the design of the research and into addressing the community challenges that are identified.

 

So, in a nutshell, the State of the Evidence report is trying to do a lot, but that's because AmeriCorps is trying to do a lot. It highlights the key findings in each of those four areas where the agency is trying to make an impact, not just the lessons learned and where there's a lot of evidence, but also where more research is needed.

And, you know, AmeriCorps is a learning organization as well. They're always looking for ideas of where to go next with their learning agenda. So the State of the Evidence report also had that in mind: as you're thinking about future research efforts, where could more effort or more research be needed, and where are some areas that could be strengthened?

 

So that's what we tried to do with the State of the Evidence report. And of course, the evidence base continues to grow, so I'm sure there will be future synthesis efforts down the road.

 

[J.B. WOGAN]

 

Let me pick up on two of those. You talked about participants and cultural competencies. So if I'm, say, graduating from college and assessing whether I want to apply for an AmeriCorps position, what are the benefits to my potential career aspirations? Does the evidence suggest anything in terms of earning potential or the ability to secure certain kinds of jobs down the line if I become an AmeriCorps member? What does the evidence suggest in terms of the benefits of being an AmeriCorps member?

 

[SCOTT RICHMAN]

 

Yeah. So we reviewed a few outcome studies that did show changes in education and employment outcomes, from a pre-test/post-test standpoint. So there is evidence suggesting that there are benefits, but one area we highlight is the need for more impact studies to further strengthen the argument that, by joining and serving through AmeriCorps, your service will have these impacts on those outcomes.

 

But there are studies in that area; it's just an area that probably needs more evidence. There's a lot to suggest that there are positive benefits.

 

[J.B. WOGAN]

 

And then, in terms of civic engagement, I'm sorry, I'm not familiar with what the data show. Has civic engagement gone up? Is there data showing that AmeriCorps, in specific parts of the country or at specific moments, has increased civic engagement? What do the data show in terms of that specific indicator?

 

[SCOTT RICHMAN]

 

Yeah. So I'll say, for the State of the Evidence report, we reviewed the analysis of the CEV, the Civic Engagement and Volunteering data, through 2019. And there's going to be a new dataset coming out that I know ORE is working on this year.

 

We're excited to highlight those findings for 2023. The rates remained fairly stable over time, I would say, but there were some differences in terms of demographic, socioeconomic, and family characteristics. I think, if I recall correctly, around 30 percent of Americans, for example, reported that they volunteered for an organization or an association. Again, we're excited to see the 2023 numbers in that area.

 

But there are differences. For example, women volunteer at a higher rate than men. Race and ethnicity are related to the frequency and rates of volunteering, as are generations. All of these different factors help us understand not just how much volunteering and civic engagement is happening, but with whom it's happening, and all of those things help paint an important and vibrant picture of civic life in America, which is an important cornerstone of society.

 

So it's a wealth of information, and new data are coming out soon.

 

[J.B. WOGAN]

 

Okay, great. And we'll link in our show notes to the State of the Evidence report so people can dig in, because, as you said, it's a big report and it covers a lot of information.

 

Mary, I did want to turn back to you. I'm curious how the State of the Evidence report influenced the agency's thinking about data and evidence.

 

[MARY HYDE]

 

Yeah, no, as Scott mentioned, it was one of the more comprehensive reviews we were able to do on the body of evidence that the agency and our partners have generated over the last five years, so it was incredibly useful to see it all in one place, to see where in our spheres of influence, we seem to have more evidence versus less evidence.

 

And I can say that we had a hunch about where we had less evidence, and that was particularly around our partnerships with organizations. Scott mentioned that we tried to systematically look at what additional capacity our programs and resources might offer to an organization. But we're also very interested in what kind of partner we are to our partner organizations.

And we just haven't had a systematic way to assess that. So as a result of having our hunch about this gap validated, we are in the middle of piloting a partner survey where we can capture that information and feedback on a more regular cadence and in a more centralized and systematic way, not dissimilar to our member exit survey. We really recognize that gap in our own data collection. So it was very useful in pointing us in a direction that we had been talking about as an agency for a number of years, but this very much helped us validate that, yes, indeed, we do need to develop some stronger data in that area.

 

And it's generated a lot of conversations. I would say that the State of the Evidence framework also helped us figure out where we might be able to step back a little from the evidence building and where we might need to fine-tune it. And I'll go back to the conversation you were having around our participant outcomes.

 

We have a lot of evidence and data in this space, and that's not by accident, since it's been a primary focus of the agency. But what we are realizing is that we don't have the drill-down, more nuanced information. So, for example, what types of leadership skills do people who participate in our programs develop? What kinds of civic leaders do they become? Where do they go to work? Prior work suggests that they go to work in the nonprofit space; they often go to work for the organization they served in. For our Seniors programs, we are piloting workforce development programs for second and third careers. There's just a lot of detail about those outcomes that we don't know. While we have a sense that they're positive and that things are going according to plan, we would like to know a little more about the specifics: the types of careers, the types of training they require, and that kind of data, which doesn't show up a lot, I think, in the state of the evidence.

 

So, just two examples of where it was informative, it was nice to have it all sort of laid out in one place, but it also pointed to us where our future efforts need to go.

 

[J.B. WOGAN]

 

That's fascinating. It brings to mind that I have a lot of friends who did Teach for America out of college, and it's been interesting to track where they've gone after doing Teach for America. Some of them continued on as teachers. Some of them were teachers for a while and then moved into other education roles.

 

When I was at a magazine, our education reporter was a TFA alum. Another person ended up in politics, and his issue area is education. So it's interesting to see how that early experience in a similar kind of AmeriCorps-type role has influenced people's careers, though not necessarily by going to work for the organization they originally volunteered with.

 

One of the kinds of studies that Mathematica looked at, Scott, was return-on-investment studies, and that might not be the first thing top of mind that people think about in terms of evaluating the impacts of AmeriCorps, given its mission around civic engagement. But talk a little bit about that specific kind of study, return on investment. What are they? Why do them? And maybe, what have we learned from the R.O.I. studies?

 

[SCOTT RICHMAN]

 

So, R.O.I. studies. What they do is help quantify a program's benefits relative to its costs, right? And doing that kind of benefit-to-cost analysis can help you get a better sense of the financial value that a program can generate.

 

We know funding and financial resources are finite, right? So these studies can really be an important tool to help you determine: if I have this amount of money or this amount of funding, where should I put it? What's going to give me the most bang for my buck, for example?

 

So what an R.O.I., or return on investment, study will do is quantify the costs associated with implementing a program, then look at the financial benefits that get produced because of it, and make a relative comparison of the two.

 

So when we looked at the evidence related to those R.O.I. studies (AmeriCorps, I think, has produced about 16 of them to date, and I know more are on the way), they found, across the different focus areas of interest to the agency, that the programs funded in those areas generate a positive return on investment.

 

So for every dollar in, that much more gets returned in value when you look at the costs and the overall benefits. And again, it's really another tool in the toolbox to help you make data-informed decisions about where to put funding. A program can make an impact, but some folks would say, is it worth it? Or should I take that funding and put it elsewhere? Those R.O.I. studies provide a little more information to help you make that determination from a funding standpoint.
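To make the arithmetic behind that comparison concrete, here is a minimal sketch in Python of how a benefit-cost ratio and net present value might be computed. The figures, time horizon, and discount rate are hypothetical, and this is not AmeriCorps' or Mathematica's actual R.O.I. methodology, just an illustration of the kind of calculation described above.

```python
# Illustrative benefit-cost calculation with hypothetical figures.
# Not AmeriCorps' actual R.O.I. methodology.

def present_value(amounts_by_year, discount_rate):
    """Discount a stream of annual amounts (year 0 = today) to present value."""
    return sum(amount / (1 + discount_rate) ** year
               for year, amount in enumerate(amounts_by_year))

# Hypothetical program: costs are paid up front, benefits accrue over five years.
program_costs = [1_000_000]                                   # dollars spent in year 0
program_benefits = [0, 400_000, 400_000, 350_000, 300_000]    # dollars of benefit per year
discount_rate = 0.03                                          # assumed annual discount rate

pv_costs = present_value(program_costs, discount_rate)
pv_benefits = present_value(program_benefits, discount_rate)

bc_ratio = pv_benefits / pv_costs      # value returned per dollar invested
net_benefit = pv_benefits - pv_costs   # net present value

print(f"Benefit-cost ratio: {bc_ratio:.2f}")        # roughly 1.35 with these hypothetical numbers
print(f"Net present value: ${net_benefit:,.0f}")
```

A ratio above 1 means the discounted benefits exceed the discounted costs, which is the "for every dollar in, that much more gets returned" framing Scott describes.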

 

[J.B. WOGAN]

 

Mary, I know we talked earlier about audiences. I asked a question about who the audiences are for the evidence that you put out. Why is a return-on-investment study or even evidence of impact more generally important for communicating the work of AmeriCorps?

 

[MARY HYDE]

 

Well, I think the return-on-investment analyses are particularly important and salient for appropriators. So, you know, the monetization of the costs and benefits of a program creates very tangible metrics for people.

 

You're going to give us this much money, and as a result, you can expect this much back in terms of social benefits, in terms of costs that you no longer have to pay because of this particular program. So it really is a metric and a way of looking at the evidence on a program that resonates with the folks who are making decisions about how much funding to invest in these types of services.

 

So for us, it's just speaking the language of certain stakeholder groups. But I think where it's perhaps different from some of these types of analyses done 10 or 15 years ago is that, because we have such a strong foundation of evidence for programs and for participants, the ability to create these kinds of estimates is more robust. I think the estimates are more credible because they're starting from a place of knowing that these programs are generating the outcomes we think they're generating. In many cases, that's using a comparative design, and I think that just makes for a much more compelling cost-benefit ratio. And that is ultimately, I think, what some stakeholder groups are most interested in.

 

[J.B. WOGAN]

 

Stephanie, I want to turn back to you, in terms of demonstrating impact and showing the value of the work that an AmeriCorps program, or a grantee program, is doing. Can you share any stories or examples of how evidence has improved a program or improved an outcome for the residents that you serve at Maggie's Place?

 

[STEPHANIE GARIPPA]

 

Yes. One of our primary objectives over the last few years has been to increase the length of stay for mothers in our homes, recognizing that the longer a mother stays in one of our homes, the stronger her tools are when she moves out. And we have actually seen an increase over the last few years in that length of stay, going from an average of about four months to about seven months, which may seem small and incremental, but in our work, a woman might stay with us for a night, or a week, or a year.

 

And so that little bit of incremental change has been huge for us. We've really sought to use the lens of evidence to update many of our practices as well, in order to increase that length of stay. We have used evidence to adjust our programmatic guidelines, to adopt trauma-informed care practices, and to adopt an evidence-based curriculum that our AmeriCorps members administer, called Seeking Safety.

 

It is a resilience-building curriculum that has multiple uses, and I think it has been very instrumental in helping increase length of stay and the overall sense of satisfaction with the program. Simultaneously, we have shifted the outcome focus for our AmeriCorps program within the last year from being more generic to more specific, measuring housing, employment, and sobriety specifically.

 

And I think, similarly to what I mentioned earlier, we've begun thinking about evidence not as a one-and-done, not as something where we've adopted a practice and checked it off the list, but as a continual process, and we are seeking to build our organizational capacity accordingly in order to sustain that direction.

 

[J.B. WOGAN]

 

All right. I want to wrap up today's conversation with a kind of future-oriented question for the group, and I'd love for each of you to weigh in. What's on your to-do list or goal sheet for the next couple of years in terms of generating and using data to further the work of AmeriCorps and Maggie's Place, and, I guess in the case of Mathematica, its role in furthering the work of AmeriCorps and/or Maggie's Place?

 

Diana, Stephanie, I don't know which of you wants to start, but I was hoping you could begin by talking a little bit about Maggie's Place.

 

[DIANA GIOIA]

 

Yeah, I can speak to the client focus. Definitely building and organizing our infrastructure, specifically around technology. We've grown exponentially as an organization within the past few years with our bed capacity, and we've added more shelters. So my goal is to make sure that, 10 or 15 years down the line, the organization has this structure set up on the front end, so that when we look back, we can say this is organized, this has been well kept, we know where to find this information. It's a forward-thinking approach. So definitely the technology, and, additionally, visualizing the data we have. I share an office with one other colleague, and we're data nerds, and we have all of our spreadsheets, and I think communicating that to our grant funders and leadership can feel very intimidating. So we are focused on that: we have a year-end numbers count, and how do we present those ten-page documents of information, and infographics, in a way that makes sense to the general public?

 

[STEPHANIE GARIPPA]

 

And I would say, with that, really utilizing the data that we have in decision making, and doing that on a continual basis, a daily basis, a weekly basis, is at the forefront of my mind as I think through the programming of our maternity homes, making sure that if we're going to keep something the same, it's based in data, and if we're going to change something, it's also based in data.

 

[J.B. WOGAN]

 

Well, Diana, I just want to say, we love spreadsheets and data nerds here. So we hope that you keep on keeping on.

 

So I'm going to turn to you now, Scott. Any aspirations? What's on your to-do list or goal sheet or vision board for the coming years in terms of using data and evidence to further the missions of Maggie's Place and AmeriCorps?

 

[SCOTT RICHMAN]

 

Yeah, so our goal for the upcoming year is to continue collaborating with the Office of Research and Evaluation and really try to shine a light on the great research and evidence that AmeriCorps helps to produce. We want to continue to put that information in the hands of those who need it and can put it to good use. One example, as I mentioned earlier, is the 2023 CEV data that was released not too long ago. One thing we're going to be working on in the upcoming year is providing fact sheets and other resources to give an updated picture of what civic engagement looks like in America and really help provide a pulse of what civic life looks like in society today. So that's one set of products we're really looking forward to working on this year.

 

And then another line of work we're going to be doing with the Office of Research and Evaluation is to create a whole suite of capacity-building supports for AmeriCorps staff. They're the ones working with grantees and partners, and those grantees and partners are the ones building evidence. So we're going to be creating a host of tools and resources to build staff capacity to support those grantees in building evidence and infusing data into their programs, because that ultimately helps produce more high-quality evaluations on the back end.

 

But yeah, those are just two examples. Our overall goal is to really support ORE in communicating and disseminating all the great work that's coming out of the agency, to shine a light on the evidence for national service, and to continue to prioritize their goals of putting information out there to good use, building capacity, and really just improving civic life in America in general. So those are a few things we're looking forward to working on this year.

 

[J.B. WOGAN]

 

Okay. Excellent. And as I said, Mary, I'd love for you to have the last word here. What's on your to-do list? What is on your agenda in terms of using data and evidence to further AmeriCorps' mission?

 

[MARY HYDE]

 

Well, I would certainly echo what everyone has said, but, you know, ultimately, our goal is to continue using our impact framework to develop the most relevant and robust evidence that we can so that communities can use it to offer the most effective public services available. I would also hope that we would use this evidence inside of our own organization to foster a culture of learning, to use this information to help us make better policy decisions, make better programmatic decisions, identify best practices based on the evidence that we're generating in partnership with our partner organizations.

 

And I think that, you know, I aspire to build evidence for those who are curious, to borrow Stephanie's language. I love her framework. I think that ultimately, this evidence is in service to creating better lives for people, and that's what my goal is, is to continue to do that with the hope that it is useful to those who most need it and most want to improve the conditions in their communities.

 

[J.B. WOGAN]

 

Okay, I think that's a great note to end on. Stephanie, Diana, Scott, Mary, thank you so much for speaking with me today. And if I haven't said it yet, I'm sure it'll be in the intro, but happy 30th birthday to AmeriCorps, too. It's quite a milestone.

 

[MARY HYDE]

 

Thank you. Thank you.

 

[STEPHANIE GARIPPA]

 

Thank you all very much.

 

[J.B. WOGAN]

 

Thanks again to our guests, Dr. Mary Hyde, Stephanie Garippa, Diana Gioia, and Scott Richman. And thank you for listening to On the Evidence, the Mathematica podcast. This episode was produced by my Mathematica colleague, Rich Clement, and made possible with financial support from AmeriCorps.

 

If you liked this episode, please consider leaving us a rating and review wherever you listen to podcasts. To catch future episodes of the show, subscribe at mathematica.org/ontheevidence.

Show notes

Read the 2023 AmeriCorps State of the Evidence Report.

About the Author

J.B. Wogan

Senior Strategic Communications Specialist