Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
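The feedback loop Lum describes can be illustrated with a toy simulation (all numbers and the allocation rule are invented for this sketch, not PredPol's actual model): two neighbourhoods have the same underlying crime rate, but one starts with more recorded crime; if patrols are allocated in proportion to past records, and new records can only come from where patrols go, the historical skew is reproduced indefinitely rather than corrected.

```python
import random

random.seed(0)

# Two neighbourhoods with the SAME underlying crime rate, but
# neighbourhood A starts with more *recorded* crime from past over-policing.
true_rate = 0.1                 # identical actual crime rate in both areas
records = {"A": 60, "B": 40}    # historical records are skewed toward A
patrols_per_day = 100

for day in range(365):
    total = records["A"] + records["B"]
    for hood in ("A", "B"):
        # The "predictive" model sends patrols in proportion to past records...
        patrols = round(patrols_per_day * records[hood] / total)
        # ...and crimes are only recorded where patrols actually go,
        # so the skew in the records reinforces itself.
        records[hood] += sum(random.random() < true_rate for _ in range(patrols))

share_A = records["A"] / (records["A"] + records["B"])
print(f"Share of records in A after a year: {share_A:.2f}")
# A's share stays near its initial 60% even though the two areas are identical.
```

The point of the sketch is that the algorithm never observes the true crime rate, only its own biased sampling of it, so "doing what the math tells us" preserves the historical disparity.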
Sierra Leone Statistical Appendix
Richard Conibere, Jana Asher, Kristen Cibelli, Jana Dudukovic, Rafe Kaplan, and Patrick Ball. Sierra Leone Statistical Appendix, A Report by the Benetech Human Rights Data Analysis Group and the American Bar Association Central European and Eurasian Law Initiative to the Truth and Reconciliation Commission. October 5, 2004.
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
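A model of this shape can be sketched as a logistic score over a county's feature vector. The feature names below mirror the predictors described in the quote, but the weights, values, and the logistic form itself are assumptions for illustration, not HRDAG's actual model or coefficients.

```python
import math

# Hypothetical feature vector for one county; names echo the predictors
# mentioned above, values are invented.
county = {
    "drug_lab_busted": 1.0,     # obvious predictor: a lab bust in the county
    "borders_us": 0.0,
    "borders_ocean": 1.0,
    "pct_mountainous": 0.35,    # less-obvious predictors
    "has_highway": 1.0,
    "school_results": -0.8,     # standardized academic results, below average
}

# Illustrative weights, invented for this sketch.
weights = {
    "drug_lab_busted": 1.2,
    "borders_us": 0.9,
    "borders_ocean": 0.4,
    "pct_mountainous": 0.7,
    "has_highway": 0.5,
    "school_results": -0.6,
}
bias = -1.5

# Logistic link: squash the weighted sum into a probability-like score
# that the county contains an unreported mass grave.
score = bias + sum(weights[k] * county[k] for k in county)
probability = 1 / (1 + math.exp(-score))
print(f"Predicted probability: {probability:.2f}")
```

In practice such a model would be trained on counties where graves have already been found, then used to rank the remaining counties for search prioritization.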
Descriptive Statistics From Statements to the Liberian Truth and Reconciliation Commission
Kristen Cibelli, Amelia Hoover, and Jule Krüger (2009). “Descriptive Statistics From Statements to the Liberian Truth and Reconciliation Commission,” a Report by the Human Rights Data Analysis Group at Benetech and Annex to the Final Report of the Truth and Reconciliation Commission of Liberia. Palo Alto, California. Benetech.
State Violence in Guatemala, 1960-1996: A Quantitative Reflection
Patrick Ball, Paul Kobrak, Herbert F. Spirer. State Violence in Guatemala, 1960-1996: A Quantitative Reflection. © 1999 American Association for the Advancement of Science. [pdf – english] [pdf – español]
How Structuring Data Unburies Critical Louisiana Police Misconduct Data
The Profile of Human Rights Violations in Timor-Leste, 1974-1999
Romesh Silva and Patrick Ball. “The Profile of Human Rights Violations in Timor-Leste, 1974-1999,” a Report by the Benetech Human Rights Data Analysis Group to the Commission on Reception, Truth and Reconciliation. 9 February 2006.
Guatemala 1993-1999 – Using MSE to Estimate the Number of Deaths
A better statistical estimation of known Syrian war victims
Researchers from Rice University and Duke University are using the tools of statistics and data science in collaboration with Human Rights Data Analysis Group (HRDAG) to accurately and efficiently estimate the number of identified victims killed in the Syrian civil war.
…
Using records from four databases of people killed in the Syrian war, Chen, Duke statistician and machine learning expert Rebecca Steorts and Rice computer scientist Anshumali Shrivastava estimated there were 191,874 unique individuals documented from March 2011 to April 2014. That’s very close to the estimate of 191,369 compiled in 2014 by HRDAG, a nonprofit that helps build scientifically defensible, evidence-based arguments of human rights violations.
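The core difficulty in that count is that the same victim often appears in several of the four databases, so simply summing list lengths overcounts. The sketch below shows the problem with naive exact-match deduplication on toy, invented records; the actual research uses probabilistic record linkage and hashing techniques far more robust than this to handle spelling variants and missing fields.

```python
# Four toy "databases" of victim records as (name, date-of-death) pairs.
# All names and dates are invented for illustration only.
db1 = [("abd al-karim", "2012-03-01"), ("samir h.", "2011-07-12")]
db2 = [("Abd Al-Karim", "2012-03-01"), ("lina k.", "2013-01-30")]
db3 = [("samir h.", "2011-07-12"), ("lina k.", "2013-01-30")]
db4 = [("abd al-karim", "2012-03-01")]

def normalize(record):
    """Canonicalize a record so trivially different copies match."""
    name, date = record
    return (name.strip().lower(), date)

all_records = db1 + db2 + db3 + db4
unique = {normalize(r) for r in all_records}

print(len(all_records))  # 7 raw records across the four lists...
print(len(unique))       # ...but only 3 unique individuals
```

Real victim lists contain transliteration differences, typos, and partial dates that defeat exact matching, which is why the estimate requires statistical record linkage rather than a set union.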