How Data Analysis Confirmed the Bias in a Family Screening Tool

Map of Pittsburgh region

HRDAG + ACLU investigated why a child welfare system was flagging disabled parents in Pennsylvania.

Being unfairly flagged as “high risk” by a child welfare system can put parents on a long and grueling path. It can mean having a child removed to foster care, followed by a long fight to regain custody. But who, or what, is doing the flagging, and how are those decisions made?

For the last six years, the Department of Human Services and the child welfare system in Allegheny County, Pennsylvania, have been using a family screening tool, an algorithm, to predict which children face danger in their homes. The algorithm, powered by artificial intelligence, crunches data about the parents to assign them a “risk score” that social workers use to help make decisions about custody. Some of the factors that feed into the risk score calculations include race, poverty status, family size, and disability. For example, the algorithm flagged parents who used county mental health services for disabilities such as attention deficit hyperactivity disorder. Flags such as these contribute to a higher risk score.
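To make concrete how indicator variables can push a score upward, here is a minimal sketch in Python. The feature names, weights, and threshold are invented for illustration; this is not the actual Allegheny Family Screening Tool, whose model and data sources are far more complex.

```python
# Hypothetical illustration of how binary indicator features can raise a
# risk score. All features, weights, and the threshold below are invented
# for exposition; this is NOT the Allegheny Family Screening Tool's model.

# One record per referral: 1 means the county's data shows the indicator.
family = {
    "used_county_mental_health_services": 1,  # e.g., an ADHD support program
    "received_public_benefits": 1,            # a stand-in for poverty status
    "large_family": 0,
}

# Invented weights: a positive weight means the flag pushes the score up.
weights = {
    "used_county_mental_health_services": 2.0,
    "received_public_benefits": 1.5,
    "large_family": 1.0,
}

def risk_score(record, weights):
    """Sum the weights of every indicator that is set for this record."""
    return sum(weights[name] * value for name, value in record.items())

score = risk_score(family, weights)
print(f"risk score: {score}")  # risk score: 3.5

# A screener sees a banded score; here, anything above 3 reads as "high".
print("high risk" if score > 3 else "lower risk")
```

The structural point the sketch makes is that indicators like these are historical: once a family appears in the county’s service records, the flag never clears, so the score can never fall. That is the “forever flagging” dynamic the analysis describes below.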

The ACLU began examining the Pennsylvania tool after the Department of Justice raised concerns that the Family Screening Tool, as it is called, was “forever flagging” parents with disabilities and contributing to their children being disproportionately removed to foster care. The ACLU asked HRDAG to analyze how the algorithm works and whether it was discriminating against parents with disabilities. HRDAG’s analysis showed that it was.

HRDAG writes in the resulting report that “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators … They are forever seen as riskier to their children.”

Today, HRDAG continues to work closely with the ACLU team, following developments with the family screening tool and with similar algorithms in other jurisdictions.

Image: David Peters, 2024.

Further reading

Marissa Gerchick. “A quick thread on our @ACLU @HRDAG audit of the Allegheny Family Screening Tool.” Twitter (X), 12 June 2023.

Sally Ho and Garance Burke. “Here’s how an AI tool may flag parents with disabilities.” AP News, 15 March 2023. HRDAG is mentioned.

Related publications

Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, and Tarak Shah (2023). “The Allegheny Family Screening Tool’s Overestimation of Utility and Risk.” Logic(s), Issue 20, 13 December 2023.

Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, and Tarak Shah (2023). “The Devil is in the Details: Interrogating Values Embedded in the Allegheny Family Screening Tool.” ACLU, Summer 2023.

Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, and Tarak Shah (2023). “The Devil is in the Details: Interrogating Values Embedded in the Allegheny Family Screening Tool.” Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, June 2023, pp. 1292–1310.

Acknowledgments

HRDAG was supported in this work by the MacArthur Foundation, the Ford Foundation, the Heising-Simons Foundation, and the Open Society Foundations.

