

Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse

Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
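The loop Lum describes can be illustrated with a toy simulation (all numbers invented, not from HRDAG's work): two districts have the same underlying crime rate, but crimes are only recorded where patrols already are, and each round the "predictive" step reallocates patrols toward wherever the records have accumulated.

```python
import random

random.seed(0)

# Two districts with the SAME underlying crime rate; district A starts
# with more patrols purely because of historical over-policing.
TRUE_RATE = 0.1                # chance one patrol records a crime (identical in A and B)
patrols = {"A": 80, "B": 20}   # initial allocation (hypothetical)
records = {"A": 0, "B": 0}     # cumulative recorded crimes

for day in range(100):
    # Crimes get *recorded* only where police are already present.
    for district, n_patrols in patrols.items():
        records[district] += sum(random.random() < TRUE_RATE
                                 for _ in range(n_patrols))
    # "Predictive" step: send tomorrow's 100 patrols where the records are.
    total = records["A"] + records["B"]
    if total:
        patrols["A"] = round(100 * records["A"] / total)
        patrols["B"] = 100 - patrols["A"]

print(patrols)  # district A keeps the lion's share of patrols
```

Even though the districts are statistically identical, the initial imbalance is reproduced, and often amplified, by the algorithm: the records measure police presence, not crime.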


Using statistics to estimate the true scope of the secret killings at the end of the Sri Lankan civil war

In the last three days of the Sri Lankan civil war, as thousands of people surrendered to government authorities, hundreds of people were put on buses driven by Army officers. Many were never seen again.

In a report released today, the International Truth and Justice Project for Sri Lanka and the Human Rights Data Analysis Group showed that more than 500 people were disappeared in just three days: May 17, 18, and 19.


Weapons of Math Destruction

Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:

As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
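The height-from-weight example can be made concrete with a small sketch (all measurements invented): an ordinary least-squares fit trained only on third-graders confidently extrapolates that even a 160-pound adult stands well under five feet.

```python
# Invented numbers: five third-graders (weight in pounds, height in inches).
weights = [50, 55, 60, 65, 70]
heights = [49, 50, 50, 51, 50]   # everyone is about four feet tall

# Ordinary least-squares fit of height on weight, computed by hand.
n = len(weights)
mean_w = sum(weights) / n
mean_h = sum(heights) / n
slope = (sum((w - mean_w) * (h - mean_h) for w, h in zip(weights, heights))
         / sum((w - mean_w) ** 2 for w in weights))
intercept = mean_h - slope * mean_w

# The model has never seen an adult, so it extrapolates from children:
pred_adult = intercept + slope * 160   # a 160-lb adult
print(f"{pred_adult:.0f} inches")      # 56 inches, i.e. about 4 ft 8 in
```

The fit itself is fine; the prediction fails because the sample was never representative of the population the model is applied to, which is exactly the excerpt's point.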


Benetech Scientists Publish Analysis of Indirect Sampling Methods in the Journal of the American Medical Association


Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice

“I was always a math nerd. My mother has a Polaroid of me in the fourth grade with my science fair project…. It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.

I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”


A Human Rights Statistician Finds Truth In Numbers

The tension started in the witness room. “You could feel the stress rolling off the walls in there,” Patrick Ball remembers. “I can remember realizing that this is why lawyers wear sport coats – you can’t see all the sweat on their arms and back.” He was, you could say, a little nervous to be cross-examined by Slobodan Milosevic.


The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


The Atrocity Archives


Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts

Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
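Lum's false-positive worry is the classic base-rate problem, and the arithmetic behind it is easy to sketch (all numbers hypothetical, not from the letter): even a tool with high sensitivity and a low false-positive rate, applied to a population where the targeted behavior is vanishingly rare, flags far more innocent people than true positives.

```python
# Base-rate arithmetic behind the false-positive concern (all numbers hypothetical).
population = 10_000_000          # people screened
actual_offenders = 100           # true future offenders in that population
sensitivity = 0.99               # fraction of true positives the tool flags
false_positive_rate = 0.01       # fraction of innocent people wrongly flagged

flagged_true = sensitivity * actual_offenders
flagged_false = false_positive_rate * (population - actual_offenders)
false_share = flagged_false / (flagged_true + flagged_false)

print(f"correctly flagged: {flagged_true:.0f}")    # 99
print(f"falsely flagged:   {flagged_false:.0f}")   # ~99,999
print(f"share of flags that are false: {false_share:.1%}")
```

With these illustrative numbers, over 99% of the people flagged would never have committed any offense, which is the sense in which the "false positives will be real people" who "suffer the consequences of being flagged just the same."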


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.


Inside a Dictator’s Secret Police


Benetech Statistical Expert Testifies in Guatemala Disappearance Case


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


How data science is changing the face of human rights

On the heels of the Women in Data Science conference, HRDAG executive director Megan Price says, “I think creativity and communication are probably the two most important skills for a data scientist to have these days.”


Death March

A mapped representation of the scale and spread of killings in Syria. HRDAG’s director of research, Megan Price, is quoted.


Guatemalan Ex-Cops Get 40 Years for Labor Leader’s Slaying


Doing a Number on Violators


Patrick Ball on the Perils of Misusing Human Rights Data


The Panic Button: High-Tech Protection for Human Rights Investigators


How statistics caught Indonesia’s war-criminals


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
