Counting the Dead in Sri Lanka
Skoll World Forum 2018
New death toll estimated in Syrian civil war
100 Women in AI Ethics
We live in very challenging times. The pervasiveness of bias in AI algorithms and the prospect of autonomous “killer” robots on the horizon demand open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, more meaningful progress toward a world with safe, beneficial AI that helps rather than harms the future of humanity.
53. Kristian Lum @kldivergence
Privacy Policy
The Allegheny Family Screening Tool’s Overestimation of Utility and Risk
Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Allegheny Family Screening Tool’s Overestimation of Utility and Risk. Logic(s). 13 December 2023. Issue 20.
Big data may be reinforcing racial bias in the criminal justice system
Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.
The killings of social movement leaders and human rights defenders in Colombia 2018 – 2023: an estimate of the universe.
Valentina Rozo Ángel and Patrick Ball. 2024. The killings of social movement leaders and human rights defenders in Colombia 2018 – 2023: an estimate of the universe. Human Rights Data Analysis Group. 18 December 2024. © HRDAG 2024. Creative Commons International license 4.0.
Predictive policing violates more than it protects
William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.
Welcoming Our 2021-2022 Human Rights and Data Science Intern
HRDAG Wins the Rafto Prize
Death rate in Habre jails higher than for Japanese POWs, trial told
Patrick Ball of the California-based Human Rights Data Analysis Group said he had calculated the mortality rate of political prisoners from 1985 to 1988 using reports completed by Habre’s feared secret police.
Quantifying Police Misconduct in Louisiana
Quantifying Injustice
“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
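As a rough illustration of the synthetic-population idea described in that excerpt, the sketch below simulates district-level drug-use incidents from survey-based rates and compares them with patrol allocations driven only by past arrests. All district names, rates, and counts are hypothetical; this is not Lum and Isaac's data or code, only a minimal sketch of the logic.

```python
# Illustrative sketch only (hypothetical data, not Lum and Isaac's code):
# build a "synthetic population" of drug-use incidents from public-health
# survey rates, then compare it with where arrest-trained predictions
# concentrate enforcement.
import numpy as np

rng = np.random.default_rng(0)

districts = ["A", "B", "C", "D"]
population = np.array([50_000, 50_000, 50_000, 50_000])
# Survey-based drug-use rates are roughly equal across districts...
survey_use_rate = np.array([0.10, 0.10, 0.11, 0.09])
# ...but historical arrest rates are not, reflecting where police patrolled.
arrest_rate = np.array([0.002, 0.010, 0.003, 0.002])

# Synthetic "ground truth": incidents drawn from survey-based rates.
true_incidents = rng.binomial(population, survey_use_rate)

# A toy "predictive" model: allocate patrols in proportion to past arrests,
# mimicking software trained only on enforcement data.
past_arrests = rng.binomial(population, arrest_rate)
patrol_share = past_arrests / past_arrests.sum()

for d, t, p in zip(districts, true_incidents, patrol_share):
    print(f"district {d}: synthetic incidents {t:>5}, patrol share {p:.0%}")
# Patrol shares track past arrests rather than the synthetic incidence,
# which is the feedback loop the excerpt describes.
```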
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
The True Dangers of AI are Closer Than We Think
William Isaac is quoted.
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crimes or where they will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and, in some cases, amplify that bias. She also works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this work, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of the FAT conference, which brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
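One simple way to picture the idea of removing sensitive information from training data is to residualize each feature on the protected attribute before fitting a model, so predictions cannot depend on it linearly. The sketch below uses synthetic data and scikit-learn; it is an illustrative approximation of that general idea, not the specific procedure developed in Lum's published work.

```python
# Minimal sketch: strip a sensitive attribute's (linear) signal from the
# features by residualizing, then fit a model on the residuals.
# Hypothetical data; illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(1)
n = 1_000
protected = rng.integers(0, 2, size=n)            # e.g., a binary group label
X = rng.normal(size=(n, 3)) + protected[:, None]  # features correlated with it
y = (X[:, 0] + rng.normal(size=n) > 0.5).astype(int)

# Project the protected variable out of each feature, keeping the residuals.
A = protected.reshape(-1, 1)
X_fair = X - LinearRegression().fit(A, X).predict(A)

model = LogisticRegression().fit(X_fair, y)
scores = model.predict_proba(X_fair)[:, 1]
for g in (0, 1):
    print(f"group {g}: mean predicted risk {scores[protected == g].mean():.3f}")
# After residualizing, mean predicted risk is approximately equal across
# groups, even though the raw features differ between them.
```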
At Toronto’s Tamil Fest, human rights group seeks data on Sri Lanka’s civil war casualties
Earlier this year, the Canadian Tamil Congress connected with HRDAG to bring its campaign to Toronto’s annual Tamil Fest, one of the largest gatherings of Canada’s Sri Lankan diaspora.
Ravichandradeva, along with a few other volunteers, spent the weekend speaking with festival-goers in Scarborough about the project and encouraging them to come forward with information about deceased or missing loved ones and friends.
“The idea is to collect thorough, scientifically rigorous numbers on the total casualties in the war and present them as a non-partisan, independent organization,” said Michelle Dukich, a data consultant with HRDAG.