

Improving the estimate of U.S. police killings

Cory Doctorow of Boing Boing writes about HRDAG executive director Patrick Ball and his contribution to Carl Bialik’s FiveThirtyEight article about the recently released Bureau of Justice Statistics report on the number of annual police killings, both reported and unreported.


Calculating US police killings using methodologies from war-crimes trials

Cory Doctorow of Boing Boing writes about HRDAG director of research Patrick Ball’s article “Violence in Blue,” published March 4 in Granta. From the post: “In a must-read article in Granta, Ball explains the fundamentals of statistical estimation, and then applies these techniques to US police killings, merging data-sets from the police and the press to arrive at an estimate of the knowable US police homicides (about 1,250/year) and the true total (about 1,500/year). That means that of all the killings by strangers in the USA, one third are committed by the police.”
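The “knowable” versus “true total” distinction in the post comes from multiple systems estimation, which uses the overlap between independent lists to estimate how many cases no list recorded. A minimal sketch of the two-list (Lincoln–Petersen) case, using made-up counts rather than HRDAG’s actual data:

```python
def lincoln_petersen(n_list_a, n_list_b, n_both):
    """Two-list capture-recapture estimate of the total population.

    n_list_a: cases recorded on list A (e.g., police records)
    n_list_b: cases recorded on list B (e.g., press reports)
    n_both:   cases appearing on both lists
    """
    return n_list_a * n_list_b / n_both

# Illustrative numbers only, not HRDAG's data: 900 deaths in one
# source, 1,000 in another, 720 matched across both.
estimate = lincoln_petersen(900, 1000, 720)
print(round(estimate))  # 1250
```

Real applications use more than two lists and log-linear models to relax the independence assumption the two-list estimator makes.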


Can ‘predictive policing’ prevent crime before it happens?

HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”


How data science is changing the face of human rights

On the heels of the Women in Data Science conference, HRDAG executive director Megan Price says, “I think creativity and communication are probably the two most important skills for a data scientist to have these days.”


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.


Mapping Mexico’s hidden graves

When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.


Calculations for the Greater Good

Rollins School of Public Health: As executive director of the Human Rights Data Analysis Group, Megan Price uses statistics to shine a light on human rights abuses.


Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts

Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”


Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. In her work, she has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.


A better statistical estimation of known Syrian war victims

Researchers from Rice University and Duke University are using the tools of statistics and data science in collaboration with Human Rights Data Analysis Group (HRDAG) to accurately and efficiently estimate the number of identified victims killed in the Syrian civil war.

Using records from four databases of people killed in the Syrian war, Chen, Duke statistician and machine-learning expert Rebecca Steorts, and Rice computer scientist Anshumali Shrivastava estimated there were 191,874 unique individuals documented from March 2011 to April 2014. That is very close to the estimate of 191,369 compiled in 2014 by HRDAG, a nonprofit that helps build scientifically defensible, evidence-based arguments about human rights violations.
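Counting unique individuals across overlapping casualty lists is a record-linkage (deduplication) problem. A toy sketch with exact matching on (name, date, location) fields; the records here are invented, and the Rice/Duke work uses hashing-based methods precisely because real data contain typos and near-duplicates that exact matching would miss:

```python
# Hypothetical victim records from two of several overlapping lists.
# Each record is (name, date of death, location); all values invented.
records_a = [
    ("A. Karim", "2012-03-01", "Homs"),
    ("B. Saleh", "2012-03-02", "Aleppo"),
]
records_b = [
    ("A. Karim", "2012-03-01", "Homs"),   # duplicate of a record in list A
    ("C. Nour", "2013-01-10", "Daraa"),
]

# Union of the two lists collapses exact duplicates, leaving the
# count of unique documented individuals.
unique = set(records_a) | set(records_b)
print(len(unique))  # 3
```

At the scale of hundreds of thousands of records, pairwise comparison is infeasible, which motivates the locality-sensitive hashing approach the researchers describe.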


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


New UN report counts 191,369 Syrian-war deaths — but the truth is probably much, much worse

Amanda Taub of Vox has interviewed HRDAG executive director Patrick Ball about the UN Office of the High Commissioner for Human Rights’ release of HRDAG’s third report on reported killings in the Syrian conflict.
From the article:
Patrick Ball, Executive Director of the Human Rights Data Analysis Group and one of the report’s authors, explained to me that this new report is not a statistical estimate of the number of people killed in the conflict so far. Rather, it’s an actual list of specific victims who have been identified by name, date, and location of death. (The report only tracked violent killings, not “excess mortality” deaths from disease or hunger that the conflict is causing indirectly.)


Using statistics to estimate the true scope of the secret killings at the end of the Sri Lankan civil war

In the last three days of the Sri Lankan civil war, as thousands of people surrendered to government authorities, hundreds of people were put on buses driven by Army officers. Many were never seen again.

In a report released today, the International Truth and Justice Project for Sri Lanka and the Human Rights Data Analysis Group showed that over 500 people were disappeared in just three days: 17, 18, and 19 May.


‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley

Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.

“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.


The Panic Button: High-Tech Protection for Human Rights Investigators


Patrick Ball on the Perils of Misusing Human Rights Data


Guatemalan Ex-Cops Get 40 Years for Labor Leader’s Slaying


Benetech Statistical Expert Testifies in Guatemala Disappearance Case


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
