How Data Analysis Confirmed the Bias in a Family Screening Tool
For the last six years, the Department of Human Services, which runs the child welfare system in Allegheny County, Pennsylvania, has been using a family screening tool—an algorithm—to predict which children face danger in their homes. The algorithm, powered by artificial intelligence, crunches data about parents to assign each family a “risk score” that social workers use to help them make decisions about custody. Among the factors that feed into the risk score are race, poverty status, family size, and disability. For example, the algorithm flagged people who had used county mental health services for disabilities such as attention deficit hyperactivity disorder; flags like these raise the family’s risk score.
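To make that mechanism concrete, the sketch below shows how a score of this kind can be built from binary flags. It is a minimal, hypothetical illustration in Python: the feature names, weights, and example family are invented for this sketch and are not the actual Family Screening Tool’s inputs or model.

```python
# A hypothetical flag-based risk score -- an illustration only, NOT the
# actual Allegheny Family Screening Tool. The feature names and weights
# here are invented; the real tool's model is more complex.

HYPOTHETICAL_WEIGHTS = {
    "used_county_mental_health_services": 2.0,  # e.g., ADHD support programs
    "received_public_benefits": 1.5,            # a rough proxy for poverty
    "large_family": 1.0,
    "prior_child_welfare_referral": 2.5,
}

def risk_score(flags: dict[str, bool]) -> float:
    """Sum the weights of every flag set for a family.

    Flags derived from past service use never reset, so a family
    flagged once carries that contribution to the score indefinitely.
    """
    return sum(
        weight
        for name, weight in HYPOTHETICAL_WEIGHTS.items()
        if flags.get(name, False)
    )

# A parent who once used county mental health services and had one prior
# referral keeps both contributions forever.
family = {
    "used_county_mental_health_services": True,
    "prior_child_welfare_referral": True,
}
print(risk_score(family))  # 4.5, compared against a screening threshold
```

Even in this toy version, the structural issue HRDAG identified is visible: the flags only ever add to the score, and nothing a family does later can remove them.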
The ACLU became involved in examining the Pennsylvania tool after the Department of Justice grew concerned that the Family Screening Tool, as it is called, was “forever flagging” parents with disabilities and contributing to the disproportionate removal of their children to foster care. The ACLU asked HRDAG to analyze how the algorithm works and whether it could be discriminating against parents with disabilities. HRDAG’s analysis showed that it could be.
As HRDAG notes in the report, “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators … They are forever seen as riskier to their children.”
Today, HRDAG remains in close contact with the ACLU team as both follow developments with the family screening tool and with similar algorithms in other jurisdictions.
Image: David Peters, 2024. Caption: HRDAG + ACLU investigated why a child welfare system was flagging disabled parents in Pennsylvania.
Further reading
Techtonic Justice. Kevin De Liban. November 2024.
The Ways AI Decides How Low-Income People Work, Live, Learn, and Survive
Twitter (X). Marissa Gerchick. 12 June 2023.
“A quick thread on our @ACLU @HRDAG audit of the Allegheny Family Screening Tool”
AP News. Sally Ho + Garance Burke. 15 March 2023.
Here’s how an AI tool may flag parents with disabilities.
HRDAG is mentioned.
Related publications
Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Allegheny Family Screening Tool’s Overestimation of Utility and Risk. Logic(s), Issue 20. 13 December 2023.
Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Devil is in the Details: Interrogating Values Embedded in the Allegheny Family Screening Tool. ACLU. Summer 2023.
Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Devil is in the Details: Interrogating Values Embedded in the Allegheny Family Screening Tool. Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency. June 2023. Pages 1292–1310.
Acknowledgments
HRDAG was supported in this work by the MacArthur Foundation, the Ford Foundation, the Heising-Simons Foundation, and the Open Society Foundations.