UN Raises Estimate of Dead in Syrian Conflict to 191,000

Nick Cumming-Bruce of the New York Times writes about the UN Office of the High Commissioner for Human Rights' release of HRDAG's third report on reported killings in the Syrian conflict. From the article: In its third report on Syria commissioned by the United Nations, the Human Rights Data Analysis Group identified 191,369 deaths from the start of the conflict in March 2011 to April 2014, more than double the 92,901 deaths cited in its previous report, which covered the first two years of the conflict. “Tragically, it is probably an underestimate of the real total number of people killed during the first three years of this murderous conflict,” ...
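The headline figure is an enumeration: records of reported killings gathered by several documentation groups are merged, and duplicate reports of the same victim are collapsed so that each person is counted once. The sketch below is only a toy illustration of that deduplication step, with invented fields and a naive exact-match key; real record linkage is probabilistic and far more careful about name variants, transliteration, and missing data.

```python
# Toy illustration of deduplicating reported-killing records across sources.
# Fields and matching rule are invented; real matching is probabilistic.
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    name: str
    date: str       # ISO date of the reported killing
    location: str
    source: str     # which documentation group reported it

def dedupe(records):
    """Group reports by a simple (name, date, location) key so each victim is counted once."""
    unique = {}
    for r in records:
        key = (r.name.strip().lower(), r.date, r.location.strip().lower())
        unique.setdefault(key, []).append(r.source)
    return unique

if __name__ == "__main__":
    reports = [
        Record("A. Example", "2012-03-01", "Homs", "source_1"),
        Record("a. example", "2012-03-01", "Homs", "source_2"),   # same victim, two lists
        Record("B. Example", "2013-07-15", "Aleppo", "source_1"),
    ]
    unique = dedupe(reports)
    print(f"{len(reports)} reports -> {len(unique)} unique victims")
```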

One Better

The University of Michigan College of Literature, Science and the Arts profiled Patrick Ball in the fall 2016 issue of its alumni magazine. Here’s an excerpt:

Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.

“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”


The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


You Are Not So Smart: How we miss what is missing and what to do about it

On the podcast You Are Not So Smart, HRDAG director of research Megan Price talked with host David McRaney about Syria, human rights violations, and statistical analysis. The topic was survivorship bias, and Megan's part in the podcast begins around minute 27. From the YANSS blog: "Unfortunately, survivorship bias stands between you and the epiphanies you seek." You Are Not So Smart, March 11, 2014 (podcast April 24, 2014), San Francisco, California.
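Survivorship bias, in this setting, is the gap between the violence that gets documented and the violence that actually occurs. The toy simulation below (invented numbers, not HRDAG's analysis) shows how uneven documentation rates can make the observed pattern contradict the true one.

```python
# Toy simulation of reporting/survivorship bias (illustrative only).
# Region B suffers more violence but is documented less often, so raw counts
# of *documented* events invert the true regional pattern.
import random

random.seed(0)
true_events = {"region_A": 1000, "region_B": 3000}
doc_prob    = {"region_A": 0.8,  "region_B": 0.2}   # chance an event is ever recorded

documented = {
    region: sum(random.random() < doc_prob[region] for _ in range(n))
    for region, n in true_events.items()
}

for region in true_events:
    print(f"{region}: true={true_events[region]:5d}  documented={documented[region]:5d}")
# The documented counts suggest region_A was worse, the opposite of the truth.
```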

Media Contact

To speak with the researchers at HRDAG, please fill out the form below. You can search our Press Room by keyword or by year.

‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley

Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.

“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.


New report published on 500 Tamils missing while in Army custody

The International Truth and Justice Project and HRDAG have published a report on 500 Tamils who disappeared while in Army custody in Sri Lanka in 2009.

The report is titled “How many people disappeared on 17-19 May 2009 in Sri Lanka?” and Patrick Ball, director of research at HRDAG, is the lead author.


Using statistics to estimate the true scope of the secret killings at the end of the Sri Lankan civil war

In the last three days of the Sri Lankan civil war, as thousands of people surrendered to government authorities, hundreds of people were put on buses driven by Army officers. Many were never seen again.

In a report released today (see here), the International Truth and Justice Project for Sri Lanka and the Human Rights Data Analysis Group showed that over 500 people were disappeared on only three days — 17, 18, and 19 May.


500 Tamils disappeared in Army custody — New Study

The Sri Lankan army must explain to the families of the disappeared and missing what happened to an estimated 500 Tamils who disappeared in its custody at the end of the war, on or around 18 May 2009, said two international NGOs who have been collating and analysing lists of names.

Sri Lanka has one of the highest numbers of enforced disappearances in the world, but these 500 represent the largest number of disappearances in a single place and time in the country. For a detailed account of the process of estimating the 500, please see “How many people disappeared on 17-19 May 2009 in Sri Lanka?”
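The estimate rests on comparing overlapping lists of named victims to infer how many people appear on no list at all. The sketch below shows the simplest two-list version of that idea, the Lincoln-Petersen estimator in Chapman's form, with invented counts; the actual report uses more lists and more careful modeling.

```python
# Two-list capture-recapture (Lincoln-Petersen, Chapman variant), invented numbers.
# n1: names on list 1, n2: names on list 2, m: names appearing on both lists.
def lincoln_petersen(n1: int, n2: int, m: int) -> float:
    """Chapman's nearly unbiased form of the Lincoln-Petersen estimator."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

if __name__ == "__main__":
    n1, n2, m = 320, 280, 190                 # hypothetical list sizes and overlap
    documented = n1 + n2 - m                  # people on at least one list
    estimate = lincoln_petersen(n1, n2, m)    # documented plus never-documented
    print(f"documented on at least one list: {documented}")
    print(f"estimated total (incl. never-documented): {estimate:.0f}")
```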


Unbiased algorithms can still be problematic

“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”

HRDAG Director of Research Patrick Ball, in agreement with Lum, argued that it is perhaps more useful to move the conversation away from bias at the individual level and instead talk about bias at the institutional or structural level. If a police department, for example, is convinced it needs to police one neighborhood more than another, it matters less whether any individual officer is racist, he said.
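Lum's point about how the data are generated can be made concrete with a toy simulation (invented numbers, not drawn from any real police department): two neighborhoods with identical offense rates but different patrol intensity produce very different arrest records, and a model that predicts those records perfectly simply reproduces the enforcement disparity.

```python
# Toy illustration: equal true offending, unequal enforcement.
# A model that perfectly "predicts" recorded arrests just reproduces the
# enforcement disparity baked into its training data.
import random

random.seed(1)
offense_rate = 0.05                                   # identical true rate in both neighborhoods
patrol_intensity = {"north": 0.9, "south": 0.3}       # chance an offense is observed and recorded
population = 10_000

arrests = {}
for hood, intensity in patrol_intensity.items():
    offenses = sum(random.random() < offense_rate for _ in range(population))
    arrests[hood] = sum(random.random() < intensity for _ in range(offenses))

print("recorded arrests:", arrests)
# A "perfect" predictor of arrests allocates more patrols north, even though
# the underlying offense rates are the same.
```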


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice

“I was always a math nerd. My mother has a polaroid of me in the fourth grade with my science fair project … . It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.

I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. She has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
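One simple way to see the preprocessing idea is to transform features so they carry no linear information about a sensitive attribute; the sketch below residualizes a feature on a group label. This is a toy illustration of the general idea, not the method published in Lum's work.

```python
# Toy "fair preprocessing": remove the linear component of a feature explained
# by a sensitive attribute. Data and approach are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 500
sensitive = rng.integers(0, 2, size=n)            # e.g. a binary group label
x = 2.0 * sensitive + rng.normal(size=n)          # a feature correlated with the group

def residualize(feature: np.ndarray, s: np.ndarray) -> np.ndarray:
    """Subtract the best linear prediction of `feature` from `s`."""
    s_centered = s - s.mean()
    beta = (s_centered @ (feature - feature.mean())) / (s_centered @ s_centered)
    return feature - beta * s_centered

x_fair = residualize(x, sensitive)
print("correlation before:", np.corrcoef(x, sensitive)[0, 1].round(3))
print("correlation after: ", np.corrcoef(x_fair, sensitive)[0, 1].round(3))
```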


DATNAV: New Guide to Navigate and Integrate Digital Data in Human Rights Research

DatNav is the result of a collaboration between Amnesty International, Benetech, and The Engine Room, which began in late 2015 and culminated in an intense four-day writing sprint facilitated by Chris Michael and Collaborations for Change in May 2016. HRDAG consultant Jule Krüger is a contributor, and HRDAG director of research Patrick Ball is a reviewer.


Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Trump’s “extreme-vetting” software will discriminate against immigrants “Under a veneer of objectivity,” say experts

Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
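The arithmetic behind that concern is the base-rate problem: when the outcome being screened for is extremely rare, even a highly accurate classifier flags mostly people who would never have committed an attack. A quick calculation with invented numbers:

```python
# Base-rate arithmetic behind the "false positives" concern (invented numbers).
population  = 10_000_000   # people screened
base_rate   = 1e-5         # proportion who would actually go on to commit an attack
sensitivity = 0.99         # fraction of true positives the tool flags
specificity = 0.99         # fraction of true negatives the tool correctly clears

true_pos  = population * base_rate * sensitivity
false_pos = population * (1 - base_rate) * (1 - specificity)

print(f"correctly flagged: {true_pos:,.0f}")
print(f"falsely flagged:   {false_pos:,.0f}")
print(f"share of flags that are false: {false_pos / (true_pos + false_pos):.1%}")
```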


Calculations for the Greater Good

From the Rollins School of Public Health: As executive director of the Human Rights Data Analysis Group, Megan Price uses statistics to shine a light on human rights abuses.


Mapping Mexico’s hidden graves

When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.
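At its core, a model like this is a supervised classifier over county-level covariates: counties where searches have already happened provide the labels, and counties with high predicted probability but no search become candidates for future searches. The sketch below shows only that general shape, with invented features, synthetic data, and a generic random forest standing in for whatever the project actually used.

```python
# Sketch of the general shape of a county-level "hidden graves" classifier.
# Features, labels, and model choice are hypothetical stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_counties = 300
# Hypothetical county-level covariates (e.g. road density, homicide rate, distance to border).
X = rng.normal(size=(n_counties, 3))
# Hypothetical labels: 1 if a hidden grave was found in the county during past searches.
y = (X[:, 1] + rng.normal(scale=0.5, size=n_counties) > 0.8).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", scores.mean().round(2))
# Counties with high predicted probability but no documented search would be
# candidates for future searches.
```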


Benetech Celebrates Milestone; Human Rights Data Analysis Group Transitioning into Independent Organization


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.
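The trade-off Ball describes can be framed with an ordinary confusion matrix: choosing a decision threshold is choosing which kind of error the system makes more often. A small synthetic illustration:

```python
# Illustration of the error trade-off: raising the decision threshold trades
# false positives for false negatives. Scores and labels are synthetic.
import numpy as np

rng = np.random.default_rng(7)
labels = rng.integers(0, 2, size=1000)
scores = labels * 0.3 + rng.normal(loc=0.4, scale=0.2, size=1000)   # noisy classifier scores

for threshold in (0.4, 0.5, 0.6):
    flagged = scores >= threshold
    false_pos = int(np.sum(flagged & (labels == 0)))
    false_neg = int(np.sum(~flagged & (labels == 1)))
    print(f"threshold={threshold:.1f}  false positives={false_pos:4d}  false negatives={false_neg:4d}")
```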


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
