Update of Iraq and Syria Data in New Paper
HRDAG Names New Board Members Julie Broome and Frank Schulenburg
Coming soon: HRDAG 2019 Year-End Review
Tech Note – using LLMs for structured info extraction
HRDAG contributes to textbook Counting Civilian Casualties
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
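The feedback loop Lum and her co-author describe is easy to reproduce in miniature. The sketch below is a toy illustration, not PredPol's actual algorithm: two neighbourhoods share the same true crime rate, but the "model" simply sends patrols wherever recorded crime is highest, so a small historical bias locks in and keeps reinforcing itself.

```python
import random

random.seed(0)

# Toy illustration of a predictive-policing feedback loop
# (a simplified sketch, not PredPol's actual model).
# Two neighbourhoods have the SAME true crime rate, but A starts
# with more recorded crime because it was historically over-policed.
TRUE_CRIME_RATE = 0.5
recorded = {"A": 20, "B": 10}   # biased historical crime records

for day in range(365):
    # The "predictive" model: patrol wherever past records are highest.
    hotspot = max(recorded, key=recorded.get)
    # Crime is only recorded where police are present to observe it.
    if random.random() < TRUE_CRIME_RATE:
        recorded[hotspot] += 1

print(recorded)
# After a year, A's count has climbed toward ~200 while B's is frozen
# at 10: the model keeps "confirming" its own biased training data,
# even though the two neighbourhoods are identical in reality.
```

Nothing in the simulated world distinguishes the two neighbourhoods; the disparity in the output comes entirely from the records the model was trained on.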
Sierra Leone Statistical Appendix
Richard Conibere, Jana Asher, Kristen Cibelli, Jana Dudukovic, Rafe Kaplan, and Patrick Ball. Sierra Leone Statistical Appendix, A Report by the Benetech Human Rights Data Analysis Group and the American Bar Association Central European and Eurasian Law Initiative to the Truth and Reconciliation Commission. October 5, 2004.
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade-three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
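The same point can be shown numerically. The sketch below uses made-up heights and weights, assuming (for illustration) that height varies little with weight within a single grade-three class; every number in it is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: one grade-three class, heights near 127 cm
# (about four feet). Within a single class, height barely tracks weight.
weight_kg = rng.uniform(22, 34, size=60)
height_cm = 127 + rng.normal(0, 2, size=60)

# Selection bias: kids self-conscious about their weight skip the exercise.
kept = weight_kg < 30
slope, intercept = np.polyfit(weight_kg[kept], height_cm[kept], 1)

for w in (25, 50, 75):  # a third grader, a teenager, an adult
    print(f"predicted height at {w} kg: {slope * w + intercept:.0f} cm")

# Every prediction hovers near the class average of ~127 cm, because the
# training data contains no one taller. The fix is representative data
# and out-of-sample checks, not a different algorithm.
```

The regression itself is working exactly as designed; it is the truncated, unrepresentative sample that guarantees the wrong answer.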
Descriptive Statistics From Statements to the Liberian Truth and Reconciliation Commission
Kristen Cibelli, Amelia Hoover, and Jule Krüger (2009). “Descriptive Statistics From Statements to the Liberian Truth and Reconciliation Commission,” a Report by the Human Rights Data Analysis Group at Benetech and Annex to the Final Report of the Truth and Reconciliation Commission of Liberia. Palo Alto, California. Benetech.