

Get Involved/Donate

Donating to HRDAG Thank you for your interest in making a donation to the Human Rights Data Analysis Group to help us use science to support our partners in the human rights world. You can make a donation by credit card on the Community Partners® Network for Good page. HRDAG is a "project of Community Partners," and right below the section on payment information, you'll be able to select "Human Rights Data Analysis Group" from a drop-down menu. (On most browsers, if you use this link, HRDAG will be pre-selected on the drop-down menu.) This transaction will appear on your credit card statement as "Network for Good." If you donate by check, ...

Stay informed about our work

Subscribe (* indicates required): Email Address*, First Name, Last Name, Organization

Podcast: Dr. Patrick Ball on Using Statistics to Uncover Truth

Dr. Patrick Ball recently visited the Plutopia News Network podcast for a wide-ranging, inspiring conversation about his work for the Human Rights Data Analysis Group. Patrick spoke about how he first discovered human rights work during his time in El Salvador with Peace Brigades International, which led to his ongoing work as a statistician and computer programmer assessing and analyzing human rights violations. He also unpacked some common statistical techniques used by researchers at the Human Rights Data Analysis Group, such as multiple systems estimation, which uses multiple different datasets to gain insights into the data we don't ...
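Multiple systems estimation is easiest to see in its simplest, two-list form, the Lincoln–Petersen estimator: the overlap between two independently collected lists lets you estimate how many cases appear on neither. The sketch below uses illustrative numbers, not HRDAG data; real MSE work uses more than two lists and models dependence between them.

```python
def lincoln_petersen(n1, n2, m):
    """Two-list capture-recapture estimate of the total population.

    n1, n2: sizes of the two lists; m: records matched on both lists.
    """
    if m == 0:
        raise ValueError("no overlap between lists: estimator undefined")
    return n1 * n2 / m

# Example: list A documents 400 deaths, list B documents 300,
# and record linkage matches 120 deaths appearing on both.
# Estimated total = 400 * 300 / 120 = 1000, implying roughly
# 180 deaths documented by neither list (1000 - 400 - 300 + 120).
print(lincoln_petersen(400, 300, 120))  # 1000.0
```

The estimator assumes the two lists are independent samples; when they are not, the ratio is biased, which is one reason practitioners prefer three or more overlapping lists.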

100 Women in AI Ethics

We live in very challenging times. The pervasiveness of bias in AI algorithms and the autonomous “killer” robots looming on the horizon necessitate an open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, more meaningful progress toward a world with safe, beneficial AI that helps, rather than hurts, the future of humanity.

53. Kristian Lum @kldivergence


Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse

Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
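A toy simulation (my construction, not Lum's model) of the feedback loop she describes: patrols are allocated where crime was recorded, but what gets recorded depends on where patrols are, so an initial disparity persists even when true crime rates are identical.

```python
def simulate(true_crime, recorded, rounds=5):
    """Return patrol shares per district after `rounds` of data-driven allocation."""
    for _ in range(rounds):
        total = sum(recorded)
        patrol = [r / total for r in recorded]                  # police sent where data says
        recorded = [c * p for c, p in zip(true_crime, patrol)]  # but data reflects presence
    return patrol

# Two districts with EQUAL true crime; district 0 starts over-policed (60 vs 40
# recorded crimes). The 60/40 patrol split never corrects itself.
shares = simulate(true_crime=[100, 100], recorded=[60, 40])
print([round(s, 2) for s in shares])  # [0.6, 0.4]
```

The point of the sketch is that the algorithm faithfully reproduces the historical disparity: nothing in the loop ever compares recorded crime against true crime.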


Are journalists lowballing the number of Iraqi war dead?

The Columbia Journalism Review investigates the casualty count in Iraq, more than a decade after the U.S. invasion. HRDAG executive director Patrick Ball is quoted. “IBC is very good at covering the bombs that go off in markets,” said Patrick Ball, an analyst at the Human Rights Data Analysis Group whose whole career, he says, is the study of “people being killed.” But quiet assassinations and military skirmishes away from the capital often receive little or no media attention.


La misión de contar muertos (“The mission of counting the dead”)


November 1st Statement from Alejandra García at the close of her father’s trial


Estimating Deaths in Timor-Leste


Court Sentences Two Former Policemen to 40 Years in Prison (Todanoticia.com)


Using Data and Statistics to Bring Down Dictators

In this story, Guerrini discusses the impact of HRDAG’s work in Guatemala, especially the trials of General José Efraín Ríos Montt and Colonel Héctor Bol de la Cruz, as well as work in El Salvador, Syria, Kosovo, and Timor-Leste. Multiple systems estimation and the perils of using raw data to draw conclusions are also addressed.
Megan Price and Patrick Ball are quoted, especially in regard to how to use raw data.
“From our perspective,” Price says, “the solution to that is both to stay very close to the data, to be very conservative in your interpretation of it and to be very clear about where the data came from, how it was collected, what its limitations might be, and to a certain extent to be skeptical about it, to ask yourself questions like, ‘What is missing from this data?’ and ‘How might that missing information change these conclusions that I’m trying to draw?’”


Predictive policing violates more than it protects

William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.



Coders Bare Invasion Death Count


Data ‘hashing’ improves estimate of the number of victims in databases

But while HRDAG’s estimate relied on the painstaking efforts of human workers to carefully weed out potential duplicate records, hashing with statistical estimation proved to be faster, easier and less expensive. The researchers said hashing also had the important advantage of a sharp confidence interval: The range of error is plus or minus 1,772, or less than 1 percent of the total number of victims.

“The big win from this method is that we can quickly calculate the probable number of unique elements in a dataset with many duplicates,” said Patrick Ball, HRDAG’s director of research. “We can do a lot with this estimate.”
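A minimal sketch of the idea of counting unique elements by hashing, using only exact-duplicate collapsing: normalize each record, hash it, and count distinct hashes. The method described in the article combines hashing with statistical estimation and handles near-duplicate records, which this toy version does not; the names and records below are invented for illustration.

```python
import hashlib

def count_unique(records):
    """Count distinct records after normalization, via a set of hashes."""
    seen = set()
    for rec in records:
        # Normalize fields (trim whitespace, lowercase) so trivially
        # different copies of the same record collide.
        key = "|".join(str(field).strip().lower() for field in rec)
        seen.add(hashlib.sha256(key.encode("utf-8")).hexdigest())
    return len(seen)

victims = [
    ("Ana Pérez", "1989-02-11", "San Salvador"),
    ("ana pérez ", "1989-02-11", "San Salvador"),  # duplicate after normalization
    ("Juan Gómez", "1990-07-03", "Santa Ana"),
]
print(count_unique(victims))  # 2
```

Storing fixed-size hashes instead of full records is what makes this cheap at scale; the statistical layer in the published work then estimates how many distinct victims the near-duplicates represent.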


Evaluating gunshot detection technology

Bailey’s analysis stemmed from data we had access to as part of our ongoing collaboration with the Invisible Institute.

How a Data Tool Tracks Police Misconduct and Wandering Officers

Some police officers avoid accountability by “wandering” to another agency. HRDAG and partners created a data tool that tracks officers’ employment history.

Kosovo

During the conflict between NATO and Yugoslavia in early 1999, hundreds of thousands of people fled Kosovo, and thousands more were killed. Who were the perpetrators? Statistical analysis helped answer this question. While at the American Association for the Advancement of Science (AAAS), members of the HRDAG team wrote several reports on the conflict. With partners at ABA CEELI (American Bar Association/Central European and Eurasian Law Initiative), HRDAG submitted an expert report that was used in the trial of former Yugoslav president Slobodan Milošević at the ICTY (International Criminal Tribunal for the Former Yugoslavia) in The Hague, ...

Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Quantifying Injustice

“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. In her work, she has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.

Donate