Syria’s status, the migrant crisis and talking to ISIS

In this week’s “Top Picks,” IRIN interviews HRDAG executive director Patrick Ball about giant data sets and whether we can trust them. “No matter how big it is, data on violence is always partial,” he says.


Weighting for the Guatemalan National Police Archive Sample: Unusual Challenges and Problems

Gary M. Shapiro, Daniel R. Guzmán, Paul Zador, Tamy Guberek, Megan E. Price, Kristian Lum (2009). “Weighting for the Guatemalan National Police Archive Sample: Unusual Challenges and Problems.” In JSM Proceedings, Survey Research Methods Section. Alexandria, VA: American Statistical Association.


Here’s how an AI tool may flag parents with disabilities

HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”


Doing a Number on Violators


Analyze This!


Weapons of Math Destruction

Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:

As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
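
A minimal sketch of that failure mode, with entirely made-up numbers and a stand-in nearest-neighbors model (not anything HRDAG uses): fit a height-from-weight predictor on third graders only, with the heavier children opting out, then apply it to adult weights.

# Hypothetical illustration of biased training data: the model only ever
# sees third graders, so it predicts roughly four-foot heights for adults.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Grade-three class: weights around 62 lbs, heights around 50 inches.
weights = rng.normal(62, 6, size=300)
heights = 38 + 0.2 * weights + rng.normal(0, 1.5, size=300)
kept = weights < 68   # self-conscious (heavier) kids skip the exercise

model = KNeighborsRegressor(n_neighbors=10)
model.fit(weights[kept].reshape(-1, 1), heights[kept])

# Applied to adults, the nearest training examples are still third graders,
# so every prediction comes back near 50 inches -- about four feet.
adult_weights = np.array([[140.0], [170.0], [200.0]])
print(model.predict(adult_weights))

The algorithm does exactly what it was asked to do; the error lives in the data it was trained on and the absence of any correction afterward.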


One Better

The University of Michigan College of Literature, Science and the Arts profiled Patrick Ball in its fall 2016 issue of the alumni magazine. Here’s an excerpt:

Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.

“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”


The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


The killings of social leaders left out of the count

An investigation by Dejusticia and the Human Rights Data Analysis Group concluded that killings of social leaders in Colombia are undercounted. In other words, the increase in these crimes in 2016 and 2017 could be even greater than what has been reported by organizations and by official figures.


Predictive policing tools send cops to poor/black neighborhoods

In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.


USA

HRDAG’s analysis and expertise continue to deepen the national conversation about police violence and criminal legal reform in the United States. In 2015 we began by considering undocumented victims of police violence, relying on the same methodological approach we’ve tested internationa...

The killing of social leaders is a more serious problem than people think

An investigation by Dejusticia and the Human Rights Data Analysis Group finds that the killings of social leaders perpetrated in Colombia are under-registered. Analyzing the different homicide figures published by various organizations since 2016 led to the conclusion that the problem is greater than is believed.


Social Science Scholars Award for HRDAG Book

In March 2013, I entered a contest called the California Series in Public Anthropology International Competition, which solicits book proposals from social science scholars who write about how social scientists create meaningful change. The winners of the Series are awarded a publishing contract with the University of California Press for a book targeted to undergraduates. With the encouragement of my HRDAG colleagues Patrick Ball and Megan Price, I proposed a book about the work of HRDAG researchers entitled, Everybody Counts: How Scientists Document the Unknown Victims of Political Violence. Earlier this month, I was contacted by the Series judges ...

Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


Mapping Mexico’s hidden graves

When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.


Death Toll In Syria Jumps To Nearly 93,000


Inside the Difficult, Dangerous Work of Tallying the ISIS Death Toll

HRDAG executive director Megan Price is interviewed by Mother Jones. An excerpt: “Violence can be hidden,” says Price. “ISIS has its own agenda. Sometimes that agenda is served by making public things they’ve done, and I have to assume, sometimes it’s served by hiding things they’ve done.”


Calculating US police killings using methodologies from war-crimes trials

Cory Doctorow of Boing Boing writes about HRDAG director of research Patrick Ball’s article “Violence in Blue,” published March 4 in Granta. From the post: “In a must-read article in Granta, Ball explains the fundamentals of statistical estimation, and then applies these techniques to US police killings, merging data-sets from the police and the press to arrive at an estimate of the knowable US police homicides (about 1,250/year) and the true total (about 1,500/year). That means that of all the killings by strangers in the USA, one third are committed by the police.”
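
The estimation Doctorow summarizes rests on merging overlapping lists of the same deaths. A toy two-list (Lincoln-Petersen) sketch in Python, using invented counts rather than the figures behind “Violence in Blue,” shows how the overlap between press and police records yields an estimate of the killings that appear on neither list.

# Toy capture-recapture sketch (Lincoln-Petersen estimator); the counts are
# invented for illustration, not the data behind "Violence in Blue".
def lincoln_petersen(n_press, n_police, n_both):
    """Estimate total deaths from two overlapping lists.

    n_press  -- deaths documented in press reports
    n_police -- deaths documented in police records
    n_both   -- deaths matched across both lists
    """
    if n_both == 0:
        raise ValueError("need at least one matched record")
    return n_press * n_police / n_both

documented = 800 + 750 - 500                # unique deaths on either list
estimated_total = lincoln_petersen(800, 750, 500)
print(documented, round(estimated_total))   # 1050 documented, 1200 estimated

# The gap between the two numbers estimates the deaths recorded nowhere.
# Real multiple-systems estimation uses more than two lists and models the
# dependence among them; two independent lists is just the simplest case.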


How Machine Learning Protects Whistle-Blowers in Staten Island

People filed complaints against NYPD officers, and HRDAG went above and beyond to protect the privacy of the people who reported the offenses.

Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
