Welcoming Our New Data Scientist
Counting the Dead in Sri Lanka
HRDAG at FAT* 2020: Pre-Trial Risk Assessment Tools
HRDAG’s Year End Review: 2019
Welcoming Our 2019-2020 Visiting Data Science Student
Welcoming Our 2019 Visiting Analyst
HRDAG and Amnesty International: Prison Mortality in Syria
New death toll estimated in Syrian civil war
100 Women in AI Ethics
We live in very challenging times. The pervasiveness of bias in AI algorithms and the autonomous “killer” robots looming on the horizon all necessitate an open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, meaningful progress toward a world with safe, beneficial AI that helps rather than hurts the future of humanity.
53. Kristian Lum @kldivergence
The Allegheny Family Screening Tool’s Overestimation of Utility and Risk
Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Allegheny Family Screening Tool’s Overestimation of Utility and Risk. Logic(s). 13 December 2023. Issue 20.
Big data may be reinforcing racial bias in the criminal justice system
Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.
Predictive policing violates more than it protects
William Isaac and Kristian Lum (2016). Predictive policing violates more than it protects. USA Today. 2 December 2016. © USA Today.
Welcoming Our 2021-2022 Human Rights and Data Science Intern
Death rate in Habre jails higher than for Japanese POWs, trial told
Patrick Ball of the California-based Human Rights Data Analysis Group said he had calculated the mortality rate of political prisoners from 1985 to 1988 using reports completed by Habre’s feared secret police.
HRDAG Wins the Rafto Prize
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crime or where it will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and, in some cases, amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT*, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
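The amplification claim can be made concrete with a small simulation. The sketch below is a hypothetical illustration only (the districts, rates, initial records, and patrol rule are all invented for this example; it is not Lum’s published analysis or any HRDAG model): two districts have identical true crime rates, but crime is recorded only where patrols go, and patrols go to the district with the most recorded crime, so a small historical disparity in the records captures every future patrol.

import numpy as np

rng = np.random.default_rng(0)
true_rate = np.array([10.0, 10.0])  # identical true weekly incident rates in both districts
recorded = np.array([12, 8])        # hypothetical historical records: district 0 was patrolled more in the past

visits = np.zeros(2)
for week in range(520):
    # "predictive" step: patrol the district with the most recorded crime to date
    target = int(np.argmax(recorded))
    visits[target] += 1
    # only patrolled crime is recorded, so records grow only where police are sent
    recorded[target] += rng.poisson(true_rate[target])

print(visits / visits.sum())  # [1.0, 0.0]: district 0 receives every patrol despite identical true rates

Under these assumptions the initial disparity is not merely preserved but locked in, which is the reinforcing feedback the excerpt describes.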
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
