

The Atrocity Archives


Under Hissène Habré’s dictatorship, ridicule killed

Patrick Ball, a statistical expert retained by the Extraordinary African Chambers, concluded that “mortality in the DDS prisons was substantially higher than in the worst twentieth-century contexts for prisoners of war.”


Trump’s “extreme-vetting” software will discriminate against immigrants “Under a veneer of objectivity,” say experts

Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
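Lum’s worry is a base-rate problem, and it can be sketched with a few lines of arithmetic. The numbers below (population screened, prevalence, accuracy rates) are invented for illustration and are not drawn from her remarks; the point is only that a rare outcome plus an imperfect screen yields mostly false positives.

    # Hypothetical numbers illustrating the base-rate problem Lum describes:
    # even a very accurate screening tool, applied to a rare outcome,
    # flags far more people wrongly than correctly.
    population = 10_000_000      # people screened (invented)
    prevalence = 1e-5            # assumed share who would commit an attack
    sensitivity = 0.99           # share of true positives the tool flags
    false_positive_rate = 0.01   # share of everyone else it also flags

    true_positives = population * prevalence * sensitivity
    false_positives = population * (1 - prevalence) * false_positive_rate

    print(f"correctly flagged: {true_positives:,.0f}")
    print(f"wrongly flagged:   {false_positives:,.0f}")
    print(f"share of flags that are false positives: "
          f"{false_positives / (false_positives + true_positives):.1%}")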


Can ‘predictive policing’ prevent crime before it happens?

HRDAG analyst William Isaac is quoted in this Science article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”


Weapons of Math Destruction

Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:

As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
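A toy version of that example takes only a few lines of Python. Everything here is invented (the weights, the heights, the opt-out rule); it simply shows that a perfectly ordinary regression, trained only on third graders with the heavier children opting out, keeps predicting child-sized heights no matter whose weight you feed it.

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented third-grade training set: weights around 50-70 lb,
    # heights clustered near four feet, weak weight-height relationship.
    weight = rng.uniform(50, 70, 300)
    height = 50 + 0.05 * (weight - 60) + rng.normal(0, 1.0, 300)

    # Children self-conscious about their weight skip the exercise.
    keep = weight < 65
    weight, height = weight[keep], height[keep]

    slope, intercept = np.polyfit(weight, height, 1)

    # An adult weighing 160 lb is predicted to be under five feet tall,
    # close to the four-foot third graders the model was trained on.
    predicted = slope * 160 + intercept
    print(f"160 lb -> {predicted / 12:.1f} feet")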


One Better

The University of Michigan College of Literature, Science, and the Arts profiled Patrick Ball in the fall 2016 issue of its alumni magazine. Here’s an excerpt:

Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.

“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”


How data science is changing the face of human rights

On the heels of the Women in Data Science conference, HRDAG executive director Megan Price says, “I think creativity and communication are probably the two most important skills for a data scientist to have these days.”


Calculating US police killings using methodologies from war-crimes trials

Cory Doctorow of Boing Boing writes about HRDAG director of research Patrick Ball’s article “Violence in Blue,” published March 4 in Granta. From the post: “In a must-read article in Granta, Ball explains the fundamentals of statistical estimation, and then applies these techniques to US police killings, merging data-sets from the police and the press to arrive at an estimate of the knowable US police homicides (about 1,250/year) and the true total (about 1,500/year). That means that of all the killings by strangers in the USA, one third are committed by the police.”
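The merging-of-lists idea can be illustrated with the simplest capture-recapture calculation. The counts below are invented, and the two-list Lincoln-Petersen estimator is far cruder than the multiple-systems estimation behind “Violence in Blue,” but it shows how the overlap between two incomplete lists lets you estimate what both are missing.

    # Two incomplete, overlapping lists of the same events: say, a police
    # list and a press list. All counts are invented for illustration.
    police_only = 350     # deaths appearing only in the police list
    press_only = 450      # deaths appearing only in the press list
    in_both = 300         # deaths appearing in both lists

    n_police = police_only + in_both
    n_press = press_only + in_both
    documented = police_only + press_only + in_both

    # Lincoln-Petersen estimator: assumes the two lists are independent,
    # an assumption real multiple-systems estimation has to test and relax.
    estimated_total = n_police * n_press / in_both

    print(f"documented deaths:      {documented}")
    print(f"estimated actual total: {estimated_total:.0f}")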


Improving the estimate of U.S. police killings

Cory Doctorow of Boing Boing writes about HRDAG executive director Patrick Ball and his contribution to Carl Bialik’s article in 538 Politics about the recently released Bureau of Justice Statistics report on the number of annual police killings, both reported and unreported.


New UN report counts 191,369 Syrian-war deaths — but the truth is probably much, much worse

Amanda Taub of Vox has interviewed HRDAG executive director Patrick Ball about the UN Office of the High Commissioner for Human Rights’ release of HRDAG’s third report on reported killings in the Syrian conflict.
From the article:
Patrick Ball, Executive Director of the Human Rights Data Analysis Group and one of the report’s authors, explained to me that this new report is not a statistical estimate of the number of people killed in the conflict so far. Rather, it’s an actual list of specific victims who have been identified by name, date, and location of death. (The report only tracked violent killings, not “excess mortality” deaths from disease or hunger that the conflict is causing indirectly.)


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
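Ball’s question about which error we want the machine to make becomes concrete once you look at a classifier’s decision threshold. The scores and labels below are synthetic and stand in for no particular system; the sketch only shows that moving the threshold trades one kind of error for the other.

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic ground truth and classifier scores, for illustration only.
    labels = rng.integers(0, 2, 1000)          # 1 = event really occurred
    scores = np.clip(0.35 * labels + rng.normal(0.4, 0.2, 1000), 0, 1)

    # Every threshold picks a different mix of the two kinds of error.
    for threshold in (0.3, 0.5, 0.7):
        flagged = scores >= threshold
        false_pos = int(np.sum(flagged & (labels == 0)))
        false_neg = int(np.sum(~flagged & (labels == 1)))
        print(f"threshold {threshold:.1f}: "
              f"{false_pos} false positives, {false_neg} false negatives")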


How statistics lifts the fog of war in Syria

Megan Price, director of research, is quoted from her Strata talk, regarding how to handle multiple data sources in conflicts such as the one in Syria. From the blogpost:
“The true number of casualties in conflicts like the Syrian war seems unknowable, but the mission of the Human Rights Data Analysis Group (HRDAG) is to make sense of such information, clouded as it is by the fog of war. They do this not by nominating one source of information as the “best”, but instead with statistical modeling of the differences between sources.”
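The “statistical modeling of the differences between sources” that Price describes can be sketched, very roughly, as a multi-list extension of the capture-recapture idea above. The three lists, their overlap counts, and the independence assumption below are all invented for illustration; real multiple-systems estimation has to model dependence between sources rather than assume it away.

    import numpy as np
    import statsmodels.api as sm

    # Invented overlap pattern for three hypothetical source lists A, B, C:
    # each row says which lists recorded a victim, each count how many
    # victims show that pattern.
    patterns = np.array([
        [1, 0, 0], [0, 1, 0], [0, 0, 1],   # recorded by one list only
        [1, 1, 0], [1, 0, 1], [0, 1, 1],   # recorded by two lists
        [1, 1, 1],                          # recorded by all three
    ])
    counts = np.array([300, 250, 200, 60, 40, 30, 10])

    # Main-effects-only Poisson log-linear model (i.e., lists independent).
    X = sm.add_constant(patterns)
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()

    # The model's prediction for the unobserved cell: victims on no list.
    unobserved = fit.predict(np.array([[1.0, 0, 0, 0]]))[0]

    print(f"documented victims: {counts.sum()}")
    print(f"estimated total:    {counts.sum() + unobserved:.0f}")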


Calculations for the Greater Good

Rollins School of Public Health: As executive director of the Human Rights Data Analysis Group, Megan Price uses statistics to shine a light on human rights abuses.


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


Data Mining on the Side of the Angels

“Data, by itself, isn’t truth.” How HRDAG uses data analysis and statistical methods to shed light on mass human rights abuses. Executive director Patrick Ball is quoted from his speech at the Chaos Communication Congress in Hamburg, Germany.


Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice

“I was always a math nerd. My mother has a polaroid of me in the fourth grade with my science fair project … . It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.

I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”


Death March

A mapped representation of the scale and spread of killings in Syria. HRDAG’s director of research, Megan Price, is quoted.


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


How We Choose Projects

For more than 20 years, HRDAG has been carving out a niche in the international human rights movement. We know what we’re good at and what we’re not qualified to do. We know what quantitative questions we think are important for the community, and we know what we like to do. These preferences guide us as we consider whether to take on a project. We’re scientists, so our priorities will come as no surprise. We like to stick to science (not ideology), avoid advocacy, answer quantifiable questions, and increase our scientific understanding. While we have no hard-and-fast rules about what projects to take on, we organize our deliberation ...

Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
