Predictive policing violates more than it protects
William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.
Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project
Patrick Ball. Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project. © 1996 American Association for the Advancement of Science.
The True Dangers of AI are Closer Than We Think
William Isaac is quoted.
Testimonials
Foundation of Human Rights Statistics in Sierra Leone
Richard Conibere. Foundation of Human Rights Statistics in Sierra Leone (abstract). Joint Statistical Meetings, Toronto, Canada. 2004.
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
South Africa
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Data Mining for Good: CJA Drink + Think
Sierra Leone TRC Data and Statistical Appendix
Welcoming Our 2021-2022 Human Rights and Data Science Intern
Third CLS Story
Fourth ALGO Story
100 Women in AI Ethics
We live in very challenging times. The pervasiveness of bias in AI algorithms and the autonomous “killer” robots looming on the horizon necessitate an open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, meaningful progress toward a world with safe, beneficial AI that will help, not hurt, the future of humanity.
53. Kristian Lum @kldivergence
The Limits of Observation for Understanding Mass Violence
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
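The feedback loop Lum describes can be made concrete with a toy simulation. The sketch below is a hypothetical illustration, not HRDAG's analysis or any deployed predictive-policing system: two neighborhoods have the same true crime rate, but one starts with more recorded crime because it was historically over-policed, and each day's patrols are allocated in proportion to the record. All parameters (TRUE_RATE, PATROLS_PER_DAY, the 60/40 starting split) are assumptions chosen for illustration.

```python
import random

random.seed(0)  # reproducible illustration

# Two neighborhoods with IDENTICAL true crime rates.
TRUE_RATE = 0.1                  # chance a patrol visit records a crime
recorded = {"A": 60, "B": 40}    # "A" starts with more recorded crime
PATROLS_PER_DAY = 100            # hypothetical fixed patrol budget

for _ in range(365):
    total = recorded["A"] + recorded["B"]
    # Naive "predictive" allocation: patrols follow recorded crime,
    # which reflects past policing, not the true crime rate.
    patrols_a = round(PATROLS_PER_DAY * recorded["A"] / total)
    patrols_b = PATROLS_PER_DAY - patrols_a
    # More patrols mean more recorded crimes, feeding the next forecast.
    recorded["A"] += sum(random.random() < TRUE_RATE for _ in range(patrols_a))
    recorded["B"] += sum(random.random() < TRUE_RATE for _ in range(patrols_b))

share_a = recorded["A"] / (recorded["A"] + recorded["B"])
print(f"Recorded-crime share for A after one year: {share_a:.0%}")
# Despite equal true rates, A's initial 60/40 disparity is preserved
# (and can drift further), because the data encode policing, not crime.
```

In expectation the recorded-crime shares behave like a Pólya-urn process: the historical 60/40 disparity never corrects toward 50/50, which is the sense in which past over-policing is “passed through the algorithm” to justify future over-policing.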