

What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared firsthand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice

“I was always a math nerd. My mother has a polaroid of me in the fourth grade with my science fair project … . It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.

I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”


What happens when you look at crime by the numbers

Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”


Recognising Uncertainty in Statistics

In Responsible Data Reflection Story #7—from the Responsible Data Forum—work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point about how quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”
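The simplest case of multiple systems estimation is the two-list Lincoln-Petersen estimator, which gauges how many events went undocumented from the overlap between two independent lists. The sketch below only illustrates that core idea with invented numbers; it is not HRDAG's methodology, which uses more than two sources and models dependence among them.

```python
# Minimal two-list capture-recapture sketch (Lincoln-Petersen estimator).
# Illustrative only: the counts below are invented, and real multiple
# systems estimation uses more than two lists and models how the lists
# depend on one another.

def lincoln_petersen(n_a: int, n_b: int, n_ab: int) -> float:
    """Estimate the total number of events from two overlapping lists.

    n_a  -- events documented by source A
    n_b  -- events documented by source B
    n_ab -- events appearing on both lists (the overlap)
    """
    if n_ab == 0:
        raise ValueError("no overlap between lists; the estimate is undefined")
    return n_a * n_b / n_ab

# Hypothetical example: two groups document 600 and 450 killings,
# with 150 victims appearing on both lists.
print(lincoln_petersen(600, 450, 150))  # -> 1800.0
```

In this made-up example the two lists together contain only 900 unique records, so roughly half of the estimated killings would be undocumented; that gap is the kind of uncertainty the Reflection Story asks readers to recognize.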


Hunting for Mexico’s mass graves with machine learning

“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”


Machine learning is being used to uncover the mass graves of Mexico’s missing

“Patrick Ball, HRDAG’s Director of Research and the statistician behind the code, explained that the Random Forest classifier was able to predict with 100% accuracy which counties would go on to have mass graves found in them in 2014 by using the model against data from 2013. The model also predicted the counties that did not have mass hidden graves found in them, but that show a high likelihood of the possibility. This prediction aspect of the model is the part that holds the most potential for future research.”
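A rough sketch of the train-on-one-year, score-the-next-year workflow these two articles describe might look like the following. The file names, column names, and model settings are hypothetical stand-ins for illustration, not HRDAG's code or data.

```python
# Hypothetical sketch: train a random forest on 2013 county-level
# predictors, then score counties for 2014. All names and parameters
# below are invented.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

FEATURES = [
    "drug_lab_busted",     # a drug lab was busted in the county
    "borders_us",          # county borders the United States
    "borders_ocean",       # county borders the ocean
    "pct_mountainous",     # share of county area that is mountainous
    "has_highway",         # presence of highways
    "school_test_scores",  # primary/secondary school academic results
]

train = pd.read_csv("counties_2013.csv")  # predictors plus a grave_found outcome
test = pd.read_csv("counties_2014.csv")   # the same counties, the following year

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(train[FEATURES], train["grave_found"])

# Probability that a hidden grave will be found in each county; the
# highest-scoring counties without known graves are candidates for search.
test["p_grave"] = model.predict_proba(test[FEATURES])[:, 1]
print(test.sort_values("p_grave", ascending=False).head(10))
```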


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


Trump’s “extreme-vetting” software will discriminate against immigrants “Under a veneer of objectivity,” say experts

Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.


Fosas clandestinas en México manifiestan existencia de crímenes de lesa humanidad

Patrick Ball, a U.S. statistician, is collaborating with the Human Rights Program of the Universidad Iberoamericana on an investigation into clandestine graves. The article is in Spanish.


Can ‘predictive policing’ prevent crime before it happens?

HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”


Data and Social Good: Using Data Science to Improve Lives, Fight Injustice, and Support Democracy

In this free, downloadable report, Mike Barlow of O’Reilly Media cites several examples of how data and the work of data scientists have made a measurable impact on organizations such as DataKind, a group that connects socially minded data scientists with organizations working to address critical humanitarian issues. HRDAG—and executive director Megan Price—is one of the first organizations whose work is mentioned.


Sobre fosas clandestinas, tenemos más información que el gobierno: Ibero

The model “can distinguish between the municipalities where we are going to find clandestine graves and those where it is unlikely that we will find them,” explained Patrick Ball, a U.S. statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City. The article is in Spanish.


Kriege und Social Media: Die Daten sind nicht perfekt

Süddeutsche Zeitung writer Mirjam Hauck interviewed HRDAG affiliate Anita Gohdes about the pitfalls of relying on social media data when interpreting violence in the context of war. The article is in German.


R programming language demands the right use case

Megan Price, HRDAG’s director of research, is quoted in this story about the R programming language. “Serious data analysis is not something you’re going to do using a mouse and drop-down boxes,” she said. “It’s the kind of thing you’re going to do getting close to the data, getting close to the code and writing some of it yourself.”


Why Collecting Data In Conflict Zones Is Invaluable—And Nearly Impossible


Why It Took So Long To Update the U.N.-Sponsored Syria Death Count

In this story, Carl Bialik of FiveThirtyEight interviews HRDAG executive director Patrick Ball about the process of de-duplication, integration of databases, and machine learning in the recent enumeration of reported casualties in Syria.
New reports of old deaths come in all the time, Ball said, making it tough to maintain a database. The duplicate-removal process means “it’s a lot like redoing the whole project each time,” he said.
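As a toy illustration of why duplicate removal makes each update feel like redoing the project, the sketch below compares every pair of records reported for the same date using a simple string-similarity score. The records, fields, and the 0.85 threshold are invented, and HRDAG's actual pipeline relies on trained classifiers over many more fields.

```python
# Toy sketch of pairwise duplicate detection between casualty records.
# Records, fields, and threshold are made up for illustration; a real
# de-duplication pipeline uses trained classifiers over many fields.
from difflib import SequenceMatcher
from itertools import combinations

records = [
    {"id": 1, "name": "Ahmad al-Hassan", "date": "2013-04-02"},
    {"id": 2, "name": "Ahmed al Hassan", "date": "2013-04-02"},
    {"id": 3, "name": "Samir Khalil",    "date": "2013-04-02"},
]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Compare every pair of records reported on the same date.
for r1, r2 in combinations(records, 2):
    if r1["date"] == r2["date"] and similarity(r1["name"], r2["name"]) > 0.85:
        print(f"Possible duplicate: record {r1['id']} and record {r2['id']}")
```

Because every new report has to be compared against the records already in the database, the number of pairwise comparisons grows quickly, which is why each update is “a lot like redoing the whole project.”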


Unbiased algorithms can still be problematic

“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”

HRDAG Director of Research Patrick Ball, agreeing with Lum, argued that it is more useful to frame the problem as bias at the institutional or structural level rather than at the individual level. If a police department, for example, is convinced it needs to police one neighborhood more heavily than another, whether any individual officer is racist is less relevant, he said.


Inside the Difficult, Dangerous Work of Tallying the ISIS Death Toll

HRDAG executive director Megan Price is interviewed by Mother Jones. An excerpt: “Violence can be hidden,” says Price. “ISIS has its own agenda. Sometimes that agenda is served by making public things they’ve done, and I have to assume, sometimes it’s served by hiding things they’ve done.”


Sous la dictature d’Hissène Habré, le ridicule tuait

Patrick Ball, a statistics expert retained by the Extraordinary African Chambers, concluded that “mortality in the DDS prisons was substantially higher than in the worst twentieth-century contexts for prisoners of war.” The article is in French.


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.

Donate