

Nonprofits Are Taking a Wide-Eyed Look at What Data Could Do

In this story about how data are transforming the nonprofit world, Patrick Ball is quoted. Here's an excerpt: "Data can have a profound impact on certain problems, but nonprofits are kidding themselves if they think the data techniques used by corporations can be applied wholesale to social problems," says Patrick Ball, head of the nonprofit Human Rights Data Analysis Group. Companies, he says, maintain complete data sets. A business knows every product it made last year, when it sold, and to whom. Charities, he says, are a different story. "If you're looking at poverty or trafficking or homicide, we don't have all the data, and we're not going to," ...


The Death Toll in Syria


Rise of the racist robots – how AI is learning all our worst impulses

“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
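
The feedback loop Lum describes is easy to reproduce. The following toy simulation is only a sketch under invented assumptions (two districts with identical true crime rates, patrols allocated in proportion to past reports, incidents observed only where police are present); it is not PredPol's actual algorithm.

import numpy as np

rng = np.random.default_rng(0)

# Two districts with identical true crime rates; district A starts with more
# historical reports, e.g. because it was patrolled more heavily in the past.
true_rate = np.array([0.5, 0.5])   # actual daily incidents per district
reports = np.array([10.0, 5.0])    # accumulated crime reports

for day in range(200):
    # "Predict" tomorrow's hotspot from past reports and patrol proportionally.
    patrol_share = reports / reports.sum()
    # Incidents are only observed (and become reports) where police patrol.
    observed = rng.poisson(true_rate * patrol_share * 2)
    reports += observed

# The initial imbalance never self-corrects: patrols stay concentrated in the
# over-reported district even though the true rates are equal.
print(patrol_share)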


Hunting for Mexico’s mass graves with machine learning

“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
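
As a rough illustration of what a county-level model built on those predictors could look like, here is a minimal sketch with invented data. The article does not publish HRDAG's actual model, features, or dataset, so every value below, and the choice of a random forest, is an assumption.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical county-level features mirroring the predictors named above.
counties = pd.DataFrame({
    "drug_lab_busted":   [1, 0, 0, 1],
    "borders_us":        [1, 0, 0, 0],
    "borders_ocean":     [0, 1, 0, 0],
    "pct_mountainous":   [0.10, 0.45, 0.80, 0.25],
    "has_highway":       [1, 1, 0, 1],
    "school_test_score": [0.62, 0.55, 0.40, 0.58],
})
labels = [1, 0, 0, 1]  # 1 = mass grave previously found in the county

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(counties, labels)

# Predicted probability that each county contains an undiscovered grave.
print(model.predict_proba(counties)[:, 1])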


Analyze This!


Doing a Number on Violators


Weapons of Math Destruction

Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:

As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
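
The thought experiment is simple to run. In the sketch below, the sample, the opt-out rule, and the nearest-neighbours model are all invented for illustration:

import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)

# Train height ~ weight on a sample drawn only from a grade-three class,
# where kids self-conscious about their weight skipped the exercise.
kid_weight = rng.normal(26, 3, 300)
kid_weight = kid_weight[kid_weight < 30]  # heavier kids opted out
kid_height = 98 + 1.2 * kid_weight + rng.normal(0, 4, kid_weight.size)  # cm

model = KNeighborsRegressor(n_neighbors=10)
model.fit(kid_weight.reshape(-1, 1), kid_height)

# Ask the model about adults: every prediction lands near the tallest kids,
# i.e. roughly four feet, exactly as the excerpt warns.
adults = np.array([[60.0], [75.0], [90.0]])  # adult weights in kg
print(model.predict(adults) / 30.48)         # heights in feet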


One Better

The University of Michigan College of Literature, Science and the Arts profiled Patrick Ball in the fall 2016 issue of its alumni magazine. Here’s an excerpt:

Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.

“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”


The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


The killings of social leaders that go uncounted

An investigation by Dejusticia and the Human Rights Data Analysis Group concluded that killings of social leaders in Colombia are undercounted; that is, the rise in these crimes in 2016 and 2017 could be even greater than what the organizations and the official figures report.
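
The undercount conclusion rests on comparing overlapping lists of victims compiled by different organizations. As a hedged illustration of the core idea, here is the simplest two-list capture-recapture (Lincoln-Petersen) estimate with invented counts; HRDAG's published estimates of this kind rely on more elaborate multiple systems estimation across several lists.

# Two organizations independently document killings; the overlap between
# their lists lets us estimate how many victims neither list captured.
# All counts below are invented for illustration.
n_list_a = 120  # killings documented by organization A
n_list_b = 95   # killings documented by organization B
n_both = 40     # victims appearing on both lists

estimated_total = n_list_a * n_list_b / n_both        # Lincoln-Petersen
documented = n_list_a + n_list_b - n_both             # union of the two lists
print(estimated_total, estimated_total - documented)  # 285.0 total, 110.0 unseen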


Predictive policing tools send cops to poor/black neighborhoods

In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.


The problem of killings of leaders is more serious than people think

An investigation by Dejusticia and the Human Rights Data Analysis Group finds that the killings of social leaders perpetrated in Colombia are underregistered. An analysis of the homicide figures published by various organizations since 2016 concluded that the problem is greater than is believed.


Counting the Unknown Victims of Political Violence: The Work of the Human Rights Data Analysis Group

Ann Harrison (2012). Counting the Unknown Victims of Political Violence: The Work of the Human Rights Data Analysis Group, in Human Rights and Information Communications Technologies: Trends and Consequences of Use. © 2012 IGI Global. All rights reserved.


Collecting Sensitive Human Rights Data in the Field: A Case Study from Amritsar, India.

Romesh Silva and Jasmine Marwaha. “Collecting Sensitive Human Rights Data in the Field: A Case Study from Amritsar, India.” In JSM Proceedings, Social Statistics Section. Alexandria, VA. © 2011 American Statistical Association. All rights reserved.


Making the Case: Investigating Large Scale Human Rights Violations Using Information Systems and Data Analysis

Patrick Ball, Herbert F. Spirer, and Louise Spirer, eds. Making the Case: Investigating Large Scale Human Rights Violations Using Information Systems and Data Analysis. © 2000 American Association for the Advancement of Science. All rights reserved. Reprinted with permission. [full text] [intro] [chapters 1 2 3 4 5 6 7 8 9 10 11 12]


On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations

Romesh Silva. “On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations.” ASA Proceedings of the Joint Statistical Meetings, the International Biometric Society (ENAR and WNAR), the Institute of Mathematical Statistics, and the Statistical Society of Canada. August, 2002.


Different Convenience Samples, Different Stories: The Case of Sierra Leone.


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
