The True Dangers of AI are Closer Than We Think
William Isaac is quoted.
Testimonials
How a Data Tool Tracks Police Misconduct and Wandering Officers
Talks & Discussions
Foundation of Human Rights Statistics in Sierra Leone
Richard Conibere (2004). Foundation of Human Rights Statistics in Sierra Leone (abstr.), Joint Statistical Meetings. Toronto, Canada.
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
South Africa
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Data Mining for Good: CJA Drink + Think
Sierra Leone TRC Data and Statistical Appendix
HRDAG Drops Dropbox
100 Women in AI Ethics
We live in very challenging times. The pervasiveness of bias in AI algorithms and the autonomous “killer” robots looming on the horizon both necessitate an open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, more meaningful progress toward a world with safe, beneficial AI that will help, not hurt, the future of humanity.
53. Kristian Lum @kldivergence
The Limits of Observation for Understanding Mass Violence
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
Data-driven development needs both social and computer scientists
Excerpt:
Data scientists are programmers who ignore probability but like pretty graphs, said Patrick Ball, a statistician and human rights advocate who cofounded the Human Rights Data Analysis Group.
“Data is broken,” Ball said. “Anyone who thinks they’re going to use big data to solve a problem is already on the path to fantasy land.”
HRDAG’s Year in Review: 2021
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
Matching the Libro Amarillo to Historical Human Rights Datasets in El Salvador
Patrick Ball (2014). A memo accompanying the release of The Yellow Book. August 20, 2014. © 2014 HRDAG. Creative Commons BY-NC-SA. [pdf, Spanish]