

Estimating Deaths


Death and the Mainframe: How data analysis can help document human rights atrocities


Data Dive Reveals 15,000 New Victims of Syria War


Carnegie Mellon Partners With Human Rights Data Analysis Group To Improve Syrian Casualty Reporting


Benetech Celebrates Milestone; Human Rights Data Analysis Group Transitioning into Independent Organization


‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley

Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.

“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.


Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse

Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
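A minimal sketch of the feedback loop Lum describes, with entirely hypothetical numbers (not an HRDAG model): patrols are allocated to wherever past recorded crime is highest, but recording itself depends on patrol presence, so an initial disparity between two districts with identical true crime rates persists and grows.

```python
# Hypothetical two-district simulation of the over-policing feedback loop.
import numpy as np

rng = np.random.default_rng(0)

true_rate = np.array([10.0, 10.0])   # identical true crime rates in both districts
recorded = np.array([12.0, 8.0])     # historical records already skewed toward district 0
detection = 0.5                      # chance a crime is recorded per unit of patrol presence

for step in range(10):
    patrol = recorded / recorded.sum()                    # send patrols where recorded crime is highest
    new_records = rng.poisson(true_rate * detection * 2 * patrol)
    recorded = recorded + new_records

print("true rates:    ", true_rate)
print("recorded crime:", recorded)   # the initial disparity persists and widens despite identical true rates
```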


Inside a Dictator’s Secret Police


Mining data on mutilations, beatings, murders


Weapons of Math Destruction

Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:

As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
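A toy illustration of that point, with made-up numbers rather than anything from the book: a simple nearest-neighbour height-from-weight model trained only on a third-grade class, with the heavier children opting out, will tell you that a 75 kg adult is about four feet tall.

```python
# Hypothetical example: biased training data, not a broken algorithm.
import numpy as np

rng = np.random.default_rng(1)

# Simulated third-grade class: weights around 25 kg, heights around 130 cm
weight = rng.normal(25, 3, size=100)
height = 100 + 1.2 * weight + rng.normal(0, 3, size=100)

# Selection bias: children self-conscious about their weight skip the exercise
keep = weight < 27
weight, height = weight[keep], height[keep]


def predict_height(w, k=5):
    """Predict height as the mean height of the k nearest training weights."""
    nearest = np.argsort(np.abs(weight - w))[:k]
    return height[nearest].mean()


# Ask the model about a 75 kg adult: the answer is still a third-grader's height
print(f"{predict_height(75) / 30.48:.1f} feet")   # roughly 4.3 feet
```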


How data science is changing the face of human rights

On the heels of the Women in Data Science conference, HRDAG executive director Megan Price says, “I think creativity and communication are probably the two most important skills for a data scientist to have these days.”


Can ‘predictive policing’ prevent crime before it happens?

HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”


A Human Rights Statistician Finds Truth In Numbers

The tension started in the witness room. “You could feel the stress rolling off the walls in there,” Patrick Ball remembers. “I can remember realizing that this is why lawyers wear sport coats – you can’t see all the sweat on their arms and back.” He was, you could say, a little nervous to be cross-examined by Slobodan Milosevic.


How statistics caught Indonesia’s war-criminals


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
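A minimal sketch of the trade-off Ball points to, using hypothetical classifier scores rather than HRDAG data: moving a decision threshold exchanges one kind of error (false positives) for the other (false negatives), so someone has to decide which error matters more.

```python
# Hypothetical scores from an imperfect classifier; labels mark the "relevant" cases.
import numpy as np

rng = np.random.default_rng(2)
labels = rng.integers(0, 2, size=1000)
scores = np.clip(labels * 0.3 + rng.normal(0.4, 0.2, size=1000), 0, 1)

for threshold in (0.3, 0.5, 0.7):
    flagged = scores >= threshold
    false_pos = np.sum(flagged & (labels == 0))    # flagged but not relevant
    false_neg = np.sum(~flagged & (labels == 1))   # relevant but missed
    print(f"threshold {threshold}: {false_pos} false positives, {false_neg} false negatives")
```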


Doing a Number on Violators


The Forensic Humanitarian

International human rights work attracts activists and lawyers, diplomats and retired politicians. One of the most admired figures in the field, however, is a ponytailed statistics guru from Silicon Valley named Patrick Ball, who has spent nearly two decades fashioning a career for himself at the intersection of mathematics and murder. You could call him a forensic humanitarian.


Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts

Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
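A back-of-the-envelope calculation, with all numbers hypothetical, showing why Lum expects the flags to be dominated by false positives: when the base rate is tiny, even an unrealistically accurate screening tool flags mostly people who would never have committed any offense.

```python
# Hypothetical base-rate arithmetic; none of these figures come from the article.
base_rate   = 1e-5        # assumed fraction of applicants who are actual threats
sensitivity = 0.99        # assumed chance the tool flags a true threat
specificity = 0.99        # assumed chance the tool clears a non-threat
population  = 1_000_000   # assumed number of people screened

true_pos  = population * base_rate * sensitivity
false_pos = population * (1 - base_rate) * (1 - specificity)

print(f"flagged threats:     {true_pos:,.0f}")    # about 10 people
print(f"flagged non-threats: {false_pos:,.0f}")   # about 10,000 people
print(f"share of flags that are false: {false_pos / (true_pos + false_pos):.1%}")
```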


Former Leader of Guatemala Is Guilty of Genocide Against Mayan Group


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
