Syria’s status, the migrant crisis and talking to ISIS

In this week’s “Top Picks,” IRIN interviews HRDAG executive director Patrick Ball about giant data sets and whether we can trust them. “No matter how big it is, data on violence is always partial,” he says.


Rise of the racist robots – how AI is learning all our worst impulses

“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
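The feedback loop Lum describes can be seen in a toy simulation. The sketch below is a minimal illustration under invented assumptions, not Lum and Isaac's actual analysis: two districts have the same true crime rate, patrols go wherever past records are highest, and only patrolled districts generate new records, so whichever district starts with more reports keeps pulling further ahead.

```python
# Toy simulation of a predictive-policing feedback loop (illustrative only).
import random

random.seed(0)

TRUE_CRIME_RATE = 0.5          # identical in both districts
records = {"A": 60, "B": 40}   # historical reports: district A is over-represented

for day in range(1000):
    # "Predictive" allocation: patrol wherever past records are highest.
    patrolled = max(records, key=records.get)
    # Crime happens at the same rate everywhere, but only the patrolled
    # district produces new records.
    if random.random() < TRUE_CRIME_RATE:
        records[patrolled] += 1

print(records)  # roughly {'A': 560, 'B': 40}: the initial gap only grows
```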


Estimating Deaths in Timor-Leste


Hunting for Mexico’s mass graves with machine learning

“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
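As a rough sketch of the kind of county-level model the article describes, the code below fits a standard classifier to synthetic data. The feature names follow the predictors quoted above, but the data, the choice of random forest, and every number here are invented for illustration; this is not HRDAG's model.

```python
# Illustrative county-level classifier on synthetic data (not HRDAG's model).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_counties = 500

X = np.column_stack([
    rng.integers(0, 2, n_counties),   # drug lab busted in the county (0/1)
    rng.integers(0, 2, n_counties),   # borders the US or the ocean (0/1)
    rng.uniform(0, 1, n_counties),    # share of the county that is mountainous
    rng.uniform(0, 1, n_counties),    # highway density (normalized)
    rng.uniform(0, 1, n_counties),    # school achievement index (normalized)
])
y = rng.integers(0, 2, n_counties)    # synthetic label: mass grave found (0/1)

model = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```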


Doing a Number on Violators


Analyze This!


Weapons of Math Destruction

Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:

As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
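The thought experiment is easy to reproduce. The sketch below uses invented numbers and a nearest-neighbour regressor to stand in for "an algorithm": trained only on third graders, it returns child-sized heights no matter what weight it is asked about.

```python
# Height-from-weight fit on a third-grade-only training set (invented numbers).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Training set: third graders only, roughly 25-35 kg and around 1.3 m tall.
weights_kg = np.array([[25], [27], [28], [30], [31], [33], [35]], dtype=float)
heights_cm = np.array([126, 128, 129, 132, 133, 136, 139], dtype=float)

model = KNeighborsRegressor(n_neighbors=3).fit(weights_kg, heights_cm)

for w in (30.0, 70.0, 90.0):  # a third grader, then two adults
    print(f"{w:>4} kg -> predicted {model.predict([[w]])[0]:.0f} cm")
# Adult queries come back around 135 cm (a bit over four feet), because the
# model has only ever seen nine-year-olds: the training data is the problem,
# not the algorithm.
```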


Estimating the human toll in Syria

Megan Price (2017). Estimating the human toll in Syria. Nature Human Behaviour. 8 February 2017. ISSN 2397-3374. © 2017 Macmillan Publishers Limited, part of Springer Nature. All rights reserved.


The killings of social leaders left out of the count

An investigation by Dejusticia and the Human Rights Data Analysis Group concluded that killings of social leaders in Colombia are undercounted. In other words, the rise in these crimes in 2016 and 2017 could be even greater than what the documenting organizations and official figures report.


Predictive policing tools send cops to poor/black neighborhoods

In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.


The killing of social leaders is a more serious problem than people think

An investigation by Dejusticia and the Human Rights Data Analysis Group finds that the killings of social leaders perpetrated in Colombia are under-registered. An analysis of the differing homicide figures published by various organizations since 2016 led to the conclusion that the problem is greater than is commonly believed.


Calculations for the Greater Good

Rollins School of Public Health: As executive director of the Human Rights Data Analysis Group, Megan Price uses statistics to shine a light on human rights abuses.


Recognising Uncertainty in Statistics

In Responsible Data Reflection Story #7 (from the Responsible Data Forum), work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point that quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”
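In its simplest two-list form, multiple systems estimation works like capture-recapture in ecology: the overlap between two independently collected lists of documented victims indicates how many victims neither list recorded. The sketch below is the textbook two-list (Lincoln-Petersen) estimator with hypothetical counts; HRDAG's actual analyses combine more than two lists and use considerably richer models.

```python
# Two-list capture-recapture estimate (textbook sketch, hypothetical numbers).

def lincoln_petersen(n_list_a: int, n_list_b: int, n_overlap: int) -> float:
    """Estimate the total number of victims from two partial lists.

    n_list_a, n_list_b: victims documented by each source.
    n_overlap: victims matched across both lists.
    """
    if n_overlap == 0:
        raise ValueError("no overlap between lists; estimate is undefined")
    return n_list_a * n_list_b / n_overlap

total = lincoln_petersen(900, 1200, 400)   # estimated total killings: 2700
documented = 900 + 1200 - 400              # deduplicated documented: 1700
print(f"estimated total: {total:.0f}")
print(f"documented (deduplicated): {documented}")
print(f"estimated undocumented: {total - documented:.0f}")  # 1000
```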


Counting the Unknown Victims of Political Violence: The Work of the Human Rights Data Analysis Group

Ann Harrison (2012). Counting the Unknown Victims of Political Violence: The Work of the Human Rights Data Analysis Group, in Human Rights and Information Communications Technologies: Trends and Consequences of Use. © 2012 IGI Global. All rights reserved.


Collecting Sensitive Human Rights Data in the Field: A Case Study from Amritsar, India.

Romesh Silva and Jasmine Marwaha. “Collecting Sensitive Human Rights Data in the Field: A Case Study from Amritsar, India.” In JSM Proceedings, Social Statistics Section. Alexandria, VA. © 2011 American Statistical Association. All rights reserved.


On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations

Romesh Silva. “On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations.” ASA Proceedings of the Joint Statistical Meetings, the International Biometric Society (ENAR and WNAR), the Institute of Mathematical Statistics, and the Statistical Society of Canada. August 2002.


Making the Case: Investigating Large Scale Human Rights Violations Using Information Systems and Data Analysis

Patrick Ball, Herbert F. Spirer, and Louise Spirer, eds. Making the Case: Investigating Large Scale Human Rights Violations Using Information Systems and Data Analysis. © 2000 American Association for the Advancement of Science. All rights reserved. Reprinted with permission. [full text] [intro] [chapters 1 2 3 4 5 6 7 8 9 10 11 12]


Different Convenience Samples, Different Stories: The Case of Sierra Leone.


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


Death Toll In Syria Jumps To Nearly 93,000


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
