Our Thoughts on #metoo
Reflections: Growing and Learning in Guatemala
Ten Years and Counting in Guatemala
Where Stats and Rights Thrive Together
Welcoming Our New Data Scientist
HRDAG and Amnesty International: Prison Mortality in Syria
How Data Processing Uncovers Misconduct in Use of Force in Puerto Rico
How a Data Tool Tracks Police Misconduct and Wandering Officers
New death toll estimated in Syrian civil war
Stay informed about our work
Introducing Structural Zero, HRDAG’s New Monthly Newsletter
Media Contact
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
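The feedback loop described above can be illustrated with a toy simulation (a hypothetical sketch, not PredPol's actual algorithm or Lum's analysis): two districts with identical underlying crime rates, where patrols are sent to whichever district has the most historical reports, so the district that starts with more reports accumulates ever more of them.

```python
import random

random.seed(0)

# Two districts with the SAME true crime rate, but district 0 starts
# with more historical reports. Each day, all patrols go to the
# predicted "hotspot" (the district with more reports so far), and
# every patrol independently observes a crime with the true rate.
true_rate = [0.1, 0.1]   # identical underlying crime rates
reports = [60, 40]       # unequal historical report counts
TOTAL_PATROLS = 100

for day in range(50):
    target = 0 if reports[0] >= reports[1] else 1  # predicted hotspot
    reports[target] += sum(
        random.random() < true_rate[target] for _ in range(TOTAL_PATROLS)
    )

share0 = reports[0] / sum(reports)
print(f"district 0's share of all reports: {share0:.2f}")
```

Even though both districts generate crime at the same rate, district 0's share of recorded reports grows far beyond its initial 60%, because the model only "sees" crime where it already sends patrols.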
Tech Note – improving LLM-driven info extraction
Who Did What to Whom? Planning and Implementing a Large-Scale Human Rights Information Project.
Patrick Ball (2008). “Who Did What to Whom? Planning and Implementing a Large-Scale Human Rights Information Project.” (Originally published in English at AAAS.) Translated into Spanish by Beatriz Verjerano. Palo Alto, California: Benetech.
Truth Commissioner
From the Guatemalan military to the South African apartheid police, code cruncher Patrick Ball singles out the perpetrators of political violence.
Death rate in Habré jails higher than for Japanese POWs, trial told
Patrick Ball of the California-based Human Rights Data Analysis Group said he had calculated the mortality rate of political prisoners from 1985 to 1988 using reports completed by Habré’s feared secret police.
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
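A county-level model of the kind described above can be sketched as a simple logistic score over those predictors. All feature names, weights, and the example inputs here are invented for illustration; HRDAG's actual model, features, and coefficients are not given in the excerpt.

```python
import math

def grave_risk_score(drug_lab_busted, borders_us, borders_ocean,
                     pct_mountainous, has_highway, school_score):
    """Combine obvious and less-obvious county predictors into a
    probability via a logistic link. Weights are illustrative only."""
    z = (1.2 * drug_lab_busted    # drug lab busted in the county (0/1)
         + 0.8 * borders_us       # borders the United States (0/1)
         + 0.5 * borders_ocean    # borders the ocean (0/1)
         + 0.9 * pct_mountainous  # fraction of county that is mountainous
         + 0.4 * has_highway      # major highway present (0/1)
         - 0.6 * school_score     # standardized school results
         - 1.0)                   # intercept
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical border county with a busted drug lab and weak schools:
risk = grave_risk_score(1, 1, 0, 0.7, 1, -0.5)
```

In practice a model like this would be fit to labeled county data (counties where mass graves were or were not found) rather than hand-weighted, but the structure — a handful of geographic and socioeconomic predictors mapped to a probability — matches the description in the excerpt.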
