
USA

HRDAG’s analysis and expertise continue to deepen the national conversation about police violence and criminal justice reform in the United States. In 2015 we began by considering undocumented victims of police violence, relying on the same methodological approach we’ve tested internationally for decades. Shortly after, we examined “predictive policing” software and demonstrated the ways that racial bias is baked into the algorithms. Following our partners’ lead, we next considered the impact of bail, and found that setting bail increases the likelihood of a defendant being found guilty. We then broadened our investigations to examine the risk assessment tools that judges use to make decisions about pre-trial supervision, and we found evidence of racial bias in the tools. Most recently we have returned to considering the challenges of documenting police violence.

Current work includes ongoing collaboration with several US-based partners, some of which are discussed in this technical note about processing scanned documents for investigations of police violence.

  • We are working with the Invisible Institute’s Citizens Police Data Project to design and maintain a data pipeline that systematically processes large quantities of documents describing potential police misconduct in Chicago. Read “Police Accountability In Chicago: From Data Dump To Usable Data” to learn more about this work.
  • We are working with the ACLU of Massachusetts to review Boston PD SWAT reports (the reports filled out before and after tactical and warrant-service operations) made public under the 17F order, which requires the Mayor of Boston to release information about the Boston Police Department’s inventory of military-grade equipment, such as mine-resistant ambush-protected armored vehicles designed for use in Iraq.
  • The University of Washington Center for Human Rights sued under the Freedom of Information Act for every form I-213 (the form recording apprehensions of people) produced by ICE or CBP in the Washington area over a period of several years. The suit is part of the Center’s Human Rights at Home research on immigrant rights, and the records help answer whether ICE and CBP are detaining people at sensitive locations such as prisons and hospitals. Read “Scraping for Pattern: Protecting Immigrant Rights in Washington State” to learn more.
  • Read “Quantifying Police Misconduct in Louisiana” to learn more about HRDAG’s work with the Innocence Project New Orleans.
  • Read “Police Violence in Puerto Rico: Flooded with Data” to learn more about HRDAG’s work with Kilometro Cero in Puerto Rico.
  • HRDAG worked with investigative journalists to process data acquired from a FOIA request regarding New York City Police Department misconduct. Read “Protecting the Privacy of Whistle-Blowers: The Staten Island Files” to learn more about this work.
  • In 2022, HRDAG and the ACLU partnered on an examination of the Allegheny Family Screening Tool, which relies on artificial intelligence to predict which children could be at risk of harm. 

Police Violence

Violence committed by police officers in the United States is, paradoxically, both highly documented and incompletely documented. We frequently encounter this situation in our work: some victims’ stories are told by many sources, while other victims’ stories are rarely if ever told. This is the gap that statistical estimation can fill.

In 2015, the US Bureau of Justice Statistics concluded that two federal efforts to document deaths that occur during arrest were woefully incomplete. Their analysis relied on the same statistical methods HRDAG’s team has been using internationally for decades, so we examined their approach, considering the various assumptions the method requires. This work resulted in an essay in Granta magazine, “Violence in Blue,” written by our director of research, Patrick Ball, which concluded that one-third of all Americans killed by strangers are killed by police. Following the murder of George Floyd by Minneapolis police officers in May 2020, Patrick refreshed the article with a new introduction, writing, “This cannot be acceptable in democracy.”
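
To make the underlying idea concrete, here is a minimal sketch of a two-list (“capture-recapture”) estimate of the kind these analyses build on. The counts are invented for illustration, and the simple estimator below assumes the two lists are independent; the papers listed below examine what happens when that assumption fails.

```python
# A minimal sketch of the two-list ("capture-recapture") idea behind this
# kind of estimate. The counts are invented for illustration, not drawn from
# HRDAG's or BJS's data, and this estimator assumes the two lists are
# independent.

def chapman_estimate(n_a: int, n_b: int, n_both: int) -> float:
    """Chapman's nearly unbiased variant of the Lincoln-Petersen estimator.

    n_a    -- deaths recorded by source A (e.g., a federal program)
    n_b    -- deaths recorded by source B (e.g., media tracking)
    n_both -- deaths appearing on both lists (matched records)
    """
    return (n_a + 1) * (n_b + 1) / (n_both + 1) - 1

n_a, n_b, n_both = 440, 510, 300           # hypothetical counts
total_hat = chapman_estimate(n_a, n_b, n_both)
documented = n_a + n_b - n_both            # unique deaths seen on either list
print(f"estimated total deaths: {total_hat:.0f}")
print(f"estimated undocumented: {total_hat - documented:.0f}")
```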

Key publications on police violence:

Patrick Ball (2020). Refresh of Violence in Blue (Granta, March 2016) with new introduction. 8 June 2020. © 2020 Granta.

Kristian Lum and Patrick Ball (2015). Estimating Undocumented Homicides with Two Lists and List Dependence. HRDAG, April 2015.

Kristian Lum and Patrick Ball (2015). How many police homicides in the US? A reconsideration. HRDAG, April 2015.

Kristian Lum and Patrick Ball (2015). BJS Report on Arrest-Related Deaths: True Number Likely Much Greater. HRDAG, March 2015.

Police Car

photo by Flickr user John Liu, CC-BY-2.0, modified by David Peters

The Problem with Predictive Policing

Many law enforcement agencies are adopting predictive policing software tools such as PredPol in an attempt to increase policing effectiveness and to eliminate bias among their officers. The claim made by the agencies and the software developers is that because predictive policing software uses data, not human judgment, the tools are free of racial bias.

But our research shows that the use of “big data” reinforces racial bias in these tools, because the tools use data generated by policing practices that target people of color and the communities in which they live.
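
To illustrate the feedback loop, here is a toy simulation (emphatically not PredPol’s actual algorithm, and with invented numbers): two neighborhoods have identical true offense rates, but one is initially patrolled more heavily, and the model only ever learns from the offenses its own patrols recorded.

```python
import random

# A toy feedback-loop simulation, not PredPol's actual algorithm.
# Two neighborhoods have identical true offense rates, but neighborhood 0
# starts out more heavily patrolled, so more of its offenses are recorded.
random.seed(0)

true_rate = 0.05                  # same underlying offense rate in both places
patrol_share = [0.7, 0.3]         # initial, unequal allocation of patrols
recorded = [0, 0]                 # "discovered" offenses: the training data

for day in range(365):
    for hood in (0, 1):
        offenses = sum(random.random() < true_rate for _ in range(1000))
        # Only offenses observed by a patrol make it into the data.
        recorded[hood] += sum(random.random() < patrol_share[hood]
                              for _ in range(offenses))
    # "Predictive" step: allocate tomorrow's patrols in proportion to recorded
    # crime so far -- data that the patrols themselves generated.
    total = sum(recorded)
    patrol_share = [recorded[0] / total, recorded[1] / total]

# The recorded-crime data, and hence the patrol allocation, stay skewed toward
# neighborhood 0 even though the true offense rates are identical.
print(patrol_share)
```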

Key publications on predictive policing:

William Isaac and Kristian Lum (2018). Setting the Record Straight on Predictive Policing and Race. In Justice Today. 3 January 2018. © 2018 In Justice Today / Medium.

Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.

William Isaac and Kristian Lum (2016). Predictive policing violates more than it protects. USA Today. 2 December 2016. © USA Today.

Kristian Lum and William Isaac (2016). To predict and serve? Significance. 10 October 2016. © 2016 The Royal Statistical Society. [related blogpost]

Investigating Risk Assessment

In many judicial processes, judges rely on pre-trial risk assessment software tools to help them decide the fates of arrested persons. Will she be released on her own recognizance until the trial? Will she be given the option to pay bail in order to secure her release before the trial? Or will she be denied bail (and release)? The software programs that give judges a risk assessment rating are touted as “bias-free” because they are algorithms, but our research shows that the data used to “train” the algorithms encode racial bias into the tools.
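
One way to make “encoding bias” concrete is to compare a tool’s error rates across groups. The sketch below computes a false positive rate by group from invented records; the measures in our published analyses are more extensive, but this is the general kind of check involved.

```python
from collections import defaultdict

# A minimal sketch of one common fairness check: comparing false positive
# rates across groups. Records are invented for illustration.
# Each record: (group, tool flagged "high risk", rearrested before trial)
records = [
    ("A", True,  False), ("A", True,  True),  ("A", False, False),
    ("A", True,  False), ("B", False, False), ("B", True,  True),
    ("B", False, False), ("B", False, True),
]

counts = defaultdict(lambda: {"fp": 0, "neg": 0})
for group, flagged, rearrested in records:
    if not rearrested:                    # among people not rearrested...
        counts[group]["neg"] += 1
        if flagged:                       # ...how often did the tool flag them?
            counts[group]["fp"] += 1

for group, c in sorted(counts.items()):
    print(group, c["fp"] / c["neg"])      # false positive rate, by group
```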

In examining different risk assessment tools used in different cities, we’ve collaborated on projects with San Francisco’s District Attorney, Chesa Boudin, and with the Criminal Justice Agency in New York City. We’ve also collaborated with The Safety + Justice Challenge to produce a primer on pre-trial risk assessment tools for judges, prosecutors, and defense attorneys.

Key publications on risk assessment:

Kristian Lum, Chesa Boudin and Megan Price (2020). The impact of overbooking on a pre-trial risk assessment tool. FAT* ’20: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. January 2020. Pages 482–491. https://doi.org/10.1145/3351095.3372846 ©ACM, Inc., 2020.

Kristian Lum and Tarak Shah (2019). Measures of Fairness for New York City’s Supervised Release Risk Assessment Tool. Human Rights Data Analysis Group. 1 October 2019. © HRDAG 2019.

Sarah L. Desmarais and Evan M. Lowder (2019). Pretrial Risk Assessment Tools: A Primer for Judges, Prosecutors, and Defense Attorneys. Safety and Justice Challenge, February 2019. © 2019 Safety and Justice Challenge. HRDAG’s Kristian Lum and Tarak Shah served as project members and made significant contributions to the primer.

Laurel Eckhouse, Kristian Lum, Cynthia Conti-Cook and Julie Ciccolini (2018). Layers of Bias: A Unified Approach for Understanding Problems With Risk Assessment. Criminal Justice and Behavior. 23 November 2018. © 2018 Sage Journals. All rights reserved. https://doi.org/10.1177/0093854818811379

Kristian Lum (2017). Limitations of mitigating judicial bias with machine learning. Nature Human Behaviour. 26 June 2017. © 2017 Macmillan Publishers Limited. All rights reserved. DOI 10.1038/s41562-017-0141.

Examining the Impact of Bail

In collaboration with the New York Legal Aid Society (NYLAS) and Stanford University assistant professor Mike Baiocchi, HRDAG has examined the effect of setting bail on a defendant’s likelihood of a guilty finding, whether through a jury’s determination at trial or through a guilty plea in advance of trial. Spearheading this collaboration for HRDAG, lead statistician Kristian Lum used NYLAS datasets and, with our partners, found that setting bail increases a defendant’s likelihood of a guilty finding, usually through a plea deal, compared to outcomes for defendants who are released or remanded.
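
For contrast, the sketch below shows the naive comparison that a causal analysis must improve on: simply tabulating guilty rates for defendants with and without bail set (here on invented records) is confounded, because judges set bail on systematically different kinds of cases. That confounding is why the study below uses a careful causal design rather than a raw comparison.

```python
# A deliberately naive comparison on invented records, not the study's method:
# raw guilty rates for defendants with and without bail set are confounded,
# because judges set bail on systematically different kinds of cases.
cases = [
    {"bail_set": True,  "guilty": True},
    {"bail_set": True,  "guilty": True},
    {"bail_set": True,  "guilty": False},
    {"bail_set": False, "guilty": False},
    {"bail_set": False, "guilty": True},
    {"bail_set": False, "guilty": False},
]

def guilty_rate(bail_set: bool) -> float:
    subset = [c for c in cases if c["bail_set"] == bail_set]
    return sum(c["guilty"] for c in subset) / len(subset)

# The raw gap motivates, but cannot by itself establish, a causal effect.
print("guilty rate, bail set:    ", guilty_rate(True))
print("guilty rate, bail not set:", guilty_rate(False))
```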

Key publication on impact of bail:

Kristian Lum, Erwin Ma and Mike Baiocchi (2017). The causal impact of bail on case outcomes for indigent defendants in New York City. Observational Studies 3 (2017) 39-64. 31 October 2017. © 2017 Institute of Mathematical Statistics.

Bail bonds sign

photo by Flickr user Daniel Schwen, CC-BY-4.0, modified by David Peters

Further Discussion

Podcast: Risk Assessment Biases | Stats + Stories, Episode 147 | Tarak Shah | 2020 (28 minutes)

Podcast: Lifelong curiosity and Criminal Justice Reform through data | Origins, Episode 19  | Kristian Lum | 2020 (52 minutes)

Video: Predictive policing and machine learning | Skoll Foundation | Megan Price | 2018 (11 minutes)

Video: Predictive Policing | Data & Society Research Institute | Kristian Lum | 2016 (56 minutes)

Video: Tyranny of the Algorithm? Predictive Analytics and Human Rights | NYU School of Law | Patrick Ball and Kristian Lum | 2016 (51 minutes)

If you’d like to support HRDAG in this project, please consider making a donation via our Donate page.

Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
