Making the Case: The Role of Statistics in Human Rights Reporting
Patrick Ball. “Making the Case: The Role of Statistics in Human Rights Reporting.” Statistical Journal of the United Nations Economic Commission for Europe. 18(2-3):163-174. 2001.
Guatemalan National Police Archive Project
Press Release, Timor-Leste, February 2006
Release of Yellow Book Calls on Salvadoran Military to Open Archives
Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project
Patrick Ball. Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project. © 1996 American Association for the Advancement of Science.
On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations
Romesh Silva. “On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations.” ASA Proceedings of the Joint Statistical Meetings, the International Biometric Society (ENAR and WNAR), the Institute of Mathematical Statistics, and the Statistical Society of Canada. August, 2002.
Weighting for the Guatemalan National Police Archive Sample: Unusual Challenges and Problems
Gary M. Shapiro, Daniel R. Guzmán, Paul Zador, Tamy Guberek, Megan E. Price, Kristian Lum (2009). “Weighting for the Guatemalan National Police Archive Sample: Unusual Challenges and Problems.” In JSM Proceedings, Survey Research Methods Section. Alexandria, VA: American Statistical Association.
Violence in Blue
Patrick Ball. “Violence in Blue.” Granta 134: 4 March 2016. © Granta Publications. All rights reserved.
DATNAV: New Guide to Navigate and Integrate Digital Data in Human Rights Research
DatNav is the result of a collaboration between Amnesty International, Benetech, and The Engine Room, which began in late 2015 and culminated in an intense four-day writing sprint facilitated by Chris Michael and Collaborations for Change in May 2016. HRDAG consultant Jule Krüger is a contributor, and HRDAG director of research Patrick Ball is a reviewer.
Pretrial Risk Assessment Tools
Sarah L. Desmarais and Evan M. Lowder (2019). Pretrial Risk Assessment Tools: A Primer for Judges, Prosecutors, and Defense Attorneys. Safety and Justice Challenge, February 2019. © 2019 Safety and Justice Challenge. HRDAG’s Kristian Lum and Tarak Shah served as project members and made significant contributions to the primer.
It is possible that not all killings of social leaders are being documented
At times, discussions of this phenomenon focus more on what the true number is, while the diagnosis remains the same: in the regions the violence is not abating, and no effective policies to end it are in sight. Against this complex backdrop, the Centro de Estudios de Derecho, Justicia y Sociedad (Dejusticia) and the Human Rights Data Analysis Group published on Wednesday the study Asesinatos de líderes sociales en Colombia en 2016–2017: una estimación del universo (Killings of social leaders in Colombia in 2016–2017: an estimate of the universe).
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
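The height-from-weight example above can be made concrete with a short simulation. This is a hypothetical sketch, not code from the book: the weights, heights, and sample size are invented for illustration, and the model is an ordinary least-squares line.

```python
import numpy as np

# Hypothetical sketch: a height-from-weight model trained only on
# third-graders. All numbers here are illustrative, not real data.
rng = np.random.default_rng(0)

weight = rng.uniform(25, 35, size=200)                    # kg, children only
height = 60 + 2.1 * weight + rng.normal(0, 3, size=200)   # cm, roughly 4 feet

# Ordinary least-squares fit: height = intercept + slope * weight.
slope, intercept = np.polyfit(weight, height, 1)

# The model's typical prediction is a child's height (about four feet),
# because no adult ever appeared in the training data.
typical_prediction = intercept + slope * weight.mean()
print(round(typical_prediction))
```

The fit itself is fine; the failure is entirely in the sample, which is the point of the passage.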
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”
Calculating US police killings using methodologies from war-crimes trials
Cory Doctorow of Boing Boing writes about HRDAG director of research Patrick Ball’s article “Violence in Blue,” published March 4 in Granta. From the post: “In a must-read article in Granta, Ball explains the fundamentals of statistical estimation, and then applies these techniques to US police killings, merging data-sets from the police and the press to arrive at an estimate of the knowable US police homicides (about 1,250/year) and the true total (about 1,500/year). That means that of all the killings by strangers in the USA, one third are committed by the police.”
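The merging of police and press datasets that Doctorow describes is, in its simplest form, two-list capture-recapture. A minimal sketch of that idea is the Lincoln-Petersen estimator; the counts below are invented for illustration and are not the figures from “Violence in Blue.”

```python
def lincoln_petersen(n_list1, n_list2, n_overlap):
    """Two-list capture-recapture estimate of the total population:
    N ~= (n_list1 * n_list2) / n_overlap."""
    return n_list1 * n_list2 / n_overlap

# Illustrative counts only: 800 killings in police records,
# 900 in press reports, 600 appearing in both.
estimated_total = lincoln_petersen(800, 900, 600)
print(estimated_total)  # 1200.0
```

The intuition: the smaller the overlap between two independent lists, the more cases both lists must be missing, so the higher the estimate of the true total.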
Perú
The World According to Artificial Intelligence (Part 2)
The World According to Artificial Intelligence – The Bias in the Machine (Part 2)
Artificial intelligence might be a technological revolution unlike any other, transforming our homes, our work, our lives; but for many – the poor, minority groups, the people deemed to be expendable – their picture remains the same.
Patrick Ball is interviewed: “The question should be, Who bears the cost when a system is wrong?”
Limitations of mitigating judicial bias with machine learning
Kristian Lum (2017). “Limitations of mitigating judicial bias with machine learning.” Nature Human Behaviour. 26 June 2017. DOI 10.1038/s41562-017-0141. © 2017 Macmillan Publishers Limited, part of Springer Nature. All rights reserved.
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
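The feedback loop Lum describes can be sketched in a few lines. This is a hypothetical toy model, not the PredPol algorithm or the analysis from her paper: two neighbourhoods have identical true crime rates, but patrols follow recorded crime, and crime is only recorded where patrols go.

```python
# Toy feedback-loop sketch with invented numbers: both areas have the
# same underlying crime, but one starts with slightly more records.
true_rate = [10, 10]   # identical true crime in both neighbourhoods
recorded = [12, 8]     # a small historical recording gap

for _ in range(50):
    hotspot = recorded.index(max(recorded))   # patrol the predicted hotspot
    recorded[hotspot] += true_rate[hotspot]   # only patrolled crime gets recorded

share = recorded[0] / sum(recorded)
print(round(share, 2))  # 0.98
```

After fifty rounds the initially over-recorded area accounts for nearly all recorded crime, even though both areas are equally dangerous: the data confirm the prediction because the prediction shaped the data.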