Welcoming Our New Data Scientist

We're thrilled to announce that Tarak Shah has joined our team as our new data scientist.

Counting the Dead in Sri Lanka

ITJP and HRDAG are urging groups inside and outside Sri Lanka to share existing casualty lists.

HRDAG at FAT* 2020: Pre-Trial Risk Assessment Tools

How do police officer booking decisions affect pre-trial risk assessment tools relied upon by judges?

HRDAG’s Year End Review: 2019

In 2019, HRDAG aimed to count those who haven't been counted.

Welcoming Our 2019-2020 Visiting Data Science Student

Bing Wang has joined HRDAG as a Visiting Data Science Student until the summer of 2020.

Welcoming Our 2019 Visiting Analyst

Valentina Rozo Ángel has joined our team as our new visiting analyst this fall.

HRDAG and Amnesty International: Prison Mortality in Syria

Today Amnesty International released “‘It breaks the human’: Torture, disease and death in Syria’s prisons,” a report detailing the conditions and mortality in Syrian prisons from 2011 to 2015, including data analysis conducted by HRDAG. The report provides harrowing accounts of ill-treatment of detainees in Syrian prisons since the conflict erupted in March 2011, and publishes HRDAG’s estimate of the number of killings that occurred inside the prisons. To accompany the report, HRDAG has released a technical memo that explains the methodology, sources, and implications of the findings. The HRDAG team used data from four ...

New death toll estimated in Syrian civil war

Kevin Uhrmacher of the Washington Post prepared a graph that illustrates reported deaths over time, by number of organizations reporting the deaths. Washington Post, August 22, 2014. Related blogpost: Updated Casualty Count for Syria.


100 Women in AI Ethics

We live in very challenging times. The pervasiveness of bias in AI algorithms and the prospect of autonomous “killer” robots on the horizon both necessitate open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, more meaningful progress toward a world with safe, beneficial AI that helps rather than harms the future of humanity.

53. Kristian Lum @kldivergence


The Allegheny Family Screening Tool’s Overestimation of Utility and Risk

Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Allegheny Family Screening Tool’s Overestimation of Utility and Risk. Logic(s). 13 December 2023. Issue 20.


Big data may be reinforcing racial bias in the criminal justice system

Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.


Predictive policing violates more than it protects

William Isaac and Kristian Lum (2016). Predictive policing violates more than it protects. USA Today. 2 December 2016. © USA Today.


Welcoming Our 2021-2022 Human Rights and Data Science Intern

Larry Barrett has joined HRDAG as a Human Rights and Data Science Intern until February 2022.

Death rate in Habre jails higher than for Japanese POWs, trial told

Patrick Ball of the California-based Human Rights Data Analysis Group said he had calculated the mortality rate of political prisoners from 1985 to 1988 using reports completed by Habre’s feared secret police.


HRDAG Wins the Rafto Prize

The Rafto Foundation, an international human rights organization, has bestowed the 2021 Rafto Prize upon HRDAG for its distinguished work defending human rights and democracy.

Estimating Deaths in Timor-Leste


Here’s how an AI tool may flag parents with disabilities

HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained with those data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT*) community and serves on the steering committee of FAT*, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
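As a minimal sketch of the feedback loop described above, consider a toy two-district simulation in Python (the numbers and the dispatch rule are illustrative assumptions, not Lum's published model or HRDAG code): both districts share the same true crime rate, biased historical records start district 0 ahead, patrols are dispatched in proportion to recorded crime, and crime enters the records only where a patrol is present.

import random

random.seed(1)

TRUE_RATE = 0.1               # identical true daily crime rate in both districts
N_PATROLS = 10                # patrol units dispatched each day
RESIDENTS_PER_PATROL = 100    # residents each patrol can observe per day
observed = [150, 50]          # hypothetical biased historical crime records

for day in range(30):
    total = sum(observed)
    shares = [count / total for count in observed]
    # Dispatch each patrol with probability proportional to *recorded*,
    # not true, crime.
    patrols = [0, 0]
    for _ in range(N_PATROLS):
        patrols[0 if random.random() < shares[0] else 1] += 1
    # Crime is recorded only where a patrol is present, so the district
    # with more patrols accumulates more records despite equal true rates.
    for d in (0, 1):
        observed[d] += sum(
            1 for _ in range(patrols[d] * RESIDENTS_PER_PATROL)
            if random.random() < TRUE_RATE
        )

print("recorded share for district 0:", round(observed[0] / sum(observed), 2))

The recorded share stays near the initial 0.75 rather than converging to the true 0.50, which is the reinforcement the passage describes; a greedier dispatch rule (for example, sending every patrol to the top-ranked district) drives the share toward 1.0, which is the amplification.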


Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.

Donate