Can Algorithms Be Fair, Transparent, and Protect Children?
As technology improves organizations’ ability to collect, manage, and analyze data, it is becoming easier to inform today’s public policy decisions, in areas ranging from health care to criminal justice, with estimates of future risk. On this episode of On the Evidence, three researchers discuss how they work with child welfare agencies in the United States to use algorithms, or what they call predictive risk models, to inform decisions by case managers and their supervisors.
The guests for this episode are Rhema Vaithianathan, Emily Putnam-Hornstein, and Beth Weigensberg.
Vaithianathan is a professor of economics and director of the Centre for Social Data Analytics in the School of Social Sciences and Public Policy at Auckland University of Technology, New Zealand, and a professor of social data and analytics at the Institute for Social Science Research at the University of Queensland, Australia. Putnam-Hornstein is an associate professor of social work at the University of Southern California and the director of the Children’s Data Network. Weigensberg is a senior researcher at Mathematica.
Vaithianathan and Putnam-Hornstein have already worked with Allegheny County, Pennsylvania, to implement a predictive risk model that uses hundreds of data elements to help the people screening calls about child abuse and neglect better assess the risk associated with each call. They are now working with two counties in Colorado to pilot a similar predictive risk model there. Last year, they initiated a partnership with Mathematica to replicate and scale up their work by offering the same kind of assistance to states and counties around the country.
On the episode, we discuss how they work with child welfare agencies to address common concerns about using algorithms in public policy, such as making algorithms transparent, subjecting their use to independent evaluation, and paying close attention to ethical issues such as racial bias. Play the full episode below.
Find more information about Mathematica’s partnership with the Centre for Social Data Analytics and the Children’s Data Network here.
Find the publications page for the Centre for Social Data Analytics, which includes research and other resources related to the Allegheny Family Screening Tool, here.
Find the results of an independent evaluation of the Allegheny County predictive risk model here.
Find The New York Times Magazine article about Allegheny County's use of algorithms in child welfare here.