

Collecting Sensitive Human Rights Data in the Field: A Case Study from Amritsar, India.

Romesh Silva and Jasmine Marwaha. “Collecting Sensitive Human Rights Data in the Field: A Case Study from Amritsar, India.” In JSM Proceedings, Social Statistics Section. Alexandria, VA. © 2011 American Statistical Association. All rights reserved.


Making the Case: Investigating Large Scale Human Rights Violations Using Information Systems and Data Analysis

Patrick Ball, Herbert F. Spirer, and Louise Spirer, eds. Making the Case: Investigating Large Scale Human Rights Violations Using Information Systems and Data Analysis. © 2000 American Association for the Advancement of Science. All rights reserved. Reprinted with permission. [full text] [intro] [chapters 1 2 3 4 5 6 7 8 9 10 11 12]


On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations

Romesh Silva. “On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations.” ASA Proceedings of the Joint Statistical Meetings, the International Biometric Society (ENAR and WNAR), the Institute of Mathematical Statistics, and the Statistical Society of Canada. August, 2002.


Using Machine Learning to Help Human Rights Investigators Sift Massive Datasets

How we built a model to search hundreds of thousands of text messages from the perpetrators of a human rights crime.
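As a rough illustration of what such a sifting model can look like (the file names, columns, and classifier below are assumptions for illustration, not HRDAG's actual pipeline), a small hand-labeled sample can train a text classifier that ranks the remaining messages by predicted relevance so investigators read the most promising ones first:

```python
# Illustrative sketch only: a simple relevance classifier for text messages,
# assuming a hand-labeled sample (labeled.csv) and an unlabeled corpus
# (messages.csv). File and column names are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

labeled = pd.read_csv("labeled.csv")       # columns: text, relevant (0/1)
unlabeled = pd.read_csv("messages.csv")    # column: text

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),
    LogisticRegression(max_iter=1000, class_weight="balanced"),
)

# Estimate how well the classifier generalizes before trusting its rankings.
auc = cross_val_score(model, labeled["text"], labeled["relevant"],
                      scoring="roc_auc", cv=5).mean()
print("cross-validated AUC:", auc)

# Score the full corpus and sort, so reviewers start with the messages the
# model considers most likely to be relevant.
model.fit(labeled["text"], labeled["relevant"])
unlabeled["score"] = model.predict_proba(unlabeled["text"])[:, 1]
unlabeled.sort_values("score", ascending=False).to_csv("ranked_messages.csv", index=False)
```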

Where Stats and Rights Thrive Together

Everyone I had the pleasure of interacting with enriched my summer in some way.

War and Illness Could Kill 85,000 Gazans in 6 Months

HRDAG director of research Patrick Ball is quoted in this New York Times article about a paper that models death tolls in Gaza.


Big Data Predictive Analytics Comes to Academic and Nonprofit Institutions to Fuel Innovation

"Revolution Analytics will allow HRDAG to handle bigger data sets and leverage the power of R to accomplish this goal and uncover the truth." Director of Research Megan Price is quoted. REVOLUTION ANALYTICS Press release February 4, 2014 Link to press release Back to Press Room

Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and—in some cases—amplify this bias. She also works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making, and as part of this work she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.


Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Middle East

Syria

Liberian Truth and Reconciliation Commission Data

In July 2009, the Human Rights Data Analysis Group concluded a three-year project with the Liberian Truth and Reconciliation Commission to help clarify Liberia’s violent history and hold perpetrators of human rights abuses accountable for their actions. In the course of this work, HRDAG analyzed more than 17,000 victim and witness statements collected by the Liberian Truth and Reconciliation Commission and compiled the data into a report entitled “Descriptive Statistics From Statements to the Liberian Truth and Reconciliation Commission.” The Liberian TRC data and the accompanying data dictionary, anonymized-statgivers.csv, contain information ...

Mexico

HRDAG and our partners Data Cívica and the Iberoamericana University created a machine-learning model to predict which counties (municipios) in Mexico have the highest probability of unreported hidden graves. The predictions help advocates to bring public attention and government resources to search for the disappeared in the places where they are most likely to be found. Context For more than ten years, Mexican authorities have been discovering hidden graves (fosas clandestinas). The casualties are attributed broadly—and sometimes inaccurately—to the country’s “drug war,” but the motivations and perpetrators behind the mass murders ...
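A minimal sketch of how such a ranking might be produced, assuming a municipio-level feature table and labels marking where graves have already been reported (the file, column names, and classifier are hypothetical, not the project's actual model):

```python
# Illustrative sketch only: ranking municipios by predicted probability of
# containing unreported hidden graves. Inputs and model choice are assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_predict

df = pd.read_csv("municipios.csv")          # one row per municipio
features = df.drop(columns=["municipio", "grave_found"])
labels = df["grave_found"]                  # 1 if a grave has been reported there

clf = GradientBoostingClassifier(random_state=0)

# Out-of-sample probabilities: each municipio is scored by a model that never
# saw its own label during training.
df["p_hidden_grave"] = cross_val_predict(
    clf, features, labels, cv=5, method="predict_proba")[:, 1]

# Municipios with a high predicted probability but no reported grave are the
# candidates for directing search efforts and public attention.
candidates = df[labels == 0].sort_values("p_hidden_grave", ascending=False)
print(candidates[["municipio", "p_hidden_grave"]].head(20))
```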

Fourth CLS Story

This story might be about Racial Justice Act work with the San Francisco Public Defender’s Office.

HRDAG Names New Board Member Margot Gerritsen

Margot is a professor in the Department of Energy Resources Engineering at Stanford University, interested in computer simulation and mathematical analysis of engineering processes.

Nonprofits Are Taking a Wide-Eyed Look at What Data Could Do

In this story about how data are transforming the nonprofit world, Patrick Ball is quoted. Here's an excerpt: "Data can have a profound impact on certain problems, but nonprofits are kidding themselves if they think the data techniques used by corporations can be applied wholesale to social problems," says Patrick Ball, head of the nonprofit Human Rights Data Analysis Group. Companies, he says, maintain complete data sets. A business knows every product it made last year, when it sold, and to whom. Charities, he says, are a different story. "If you're looking at poverty or trafficking or homicide, we don't have all the data, and we're not going to," ...

Stay informed about our work


The Death Toll in Syria


Media Contact

To speak with the researchers at HRDAG, please fill out the form below. You can search our Press Room by keyword or by year.

Rise of the racist robots – how AI is learning all our worst impulses

“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
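The feedback loop can be illustrated with a toy simulation (a simplification for intuition only, not Lum and Isaac's model): two neighborhoods with identical true crime rates, where patrols follow historical records and new records accumulate only where patrols are sent, so an initial disparity compounds over time.

```python
# Toy feedback-loop simulation: equal underlying crime, skewed historical
# records. Patrols follow the records, and only patrolled crime gets recorded.
import numpy as np

rng = np.random.default_rng(0)
true_rate = np.array([10.0, 10.0])   # identical underlying daily crime rates
records = np.array([60.0, 40.0])     # historical reports are skewed toward A

for day in range(100):
    patrol_share = records / records.sum()            # predictions from past records
    observed = rng.poisson(true_rate * patrol_share)  # crime is seen where police are
    records = records + observed                      # new reports feed the next prediction

print("final share of patrols sent to neighborhood A:", records[0] / records.sum())
```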


Estimating Deaths in Timor-Leste


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.

Donate