

Clustering and Solving the Right Problem

In our database deduplication work, we’re trying to figure out which records refer to the same person and which refer to different people. We write software that compares tens of millions of pairs of records and fits a model that assigns each pair a probability that the two records refer to the same person. This step is called pairwise classification. However, there may be more than one pair of records that refers to the same person; sometimes three, four, or more reports of the same death are recorded. So once we have all the pairs classified, we need to decide which groups of records refer to the ...
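As a rough illustration of the grouping step only, here is a minimal Python sketch that turns pairwise match probabilities into clusters by linking every pair above a threshold with a union-find structure. The record IDs, probabilities, and threshold are invented for the example, and this is a generic sketch, not HRDAG’s actual clustering code.

```python
# Minimal sketch: group records whose pairwise match probability
# exceeds a threshold, using union-find (connected components).
# Record IDs, probabilities, and the threshold are invented.

def find(parent, x):
    # Follow parent pointers to the root, halving the path as we go.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def cluster(pair_probs, threshold=0.9):
    """pair_probs: dict mapping (record_a, record_b) -> match probability."""
    parent = {}
    for a, b in pair_probs:
        parent.setdefault(a, a)
        parent.setdefault(b, b)
    for (a, b), p in pair_probs.items():
        if p >= threshold:                      # treat this pair as a match
            parent[find(parent, a)] = find(parent, b)
    groups = {}
    for rec in parent:
        groups.setdefault(find(parent, rec), set()).add(rec)
    return list(groups.values())

# Hypothetical probabilities from the pairwise classification step.
pairs = {("r1", "r2"): 0.97, ("r2", "r3"): 0.94, ("r4", "r5"): 0.12}
print(cluster(pairs))   # three clusters: {r1, r2, r3}, {r4}, {r5}
```

A hard threshold plus transitive closure like this can chain unrelated records together through intermediate matches, which is part of why the grouping step deserves its own careful treatment.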

An Award for Anita Gohdes

On November 26, HRDAG colleague Anita Gohdes was awarded the German Dissertation Prize for the Social Sciences. The patron of the prize is the President of the German Parliament, Norbert Lammert, who presented Anita with the award. Anita’s dissertation, “Repression 2.0: The Internet in the War Arsenal of Modern Dictators,” investigates the role played by social media networks in modern dictatorships, such as President Assad’s regime in Syria. On one hand, Anita argues, social media can help opposition groups to organize more effectively, but on the other hand, the same networks allow regimes to monitor and manipulate the population. ...

Counting the Dead in Sri Lanka

ITJP and HRDAG are urging groups inside and outside Sri Lanka to share existing casualty lists.

Disrupt San Francisco TechCrunch 2018

On September 7, 2018, Kristian Lum and Patrick Ball participated in a panel at Disrupt San Francisco by TechCrunch. The talk was titled "Dismantling Algorithmic Bias." Brian Brackeen of Kairos was part of the panel as well, and the talk was moderated by TechCrunch reporter Megan Rose Dickey. From the TechCrunch website, "Disrupt is a 3-day conference focused on breaking technology news and developments with big-name thought leaders who are making waves in the industry." Video of the talk is available here, and Megan Rose Dickey's coverage is here.

Where Stats and Rights Thrive Together

Everyone I had the pleasure of interacting with enriched my summer in some way.

HRDAG – 25 Years and Counting

Today is a very special day for all of us at HRDAG. This is, of course, the 68th anniversary of the Universal Declaration of Human Rights—but this day also marks our 25th year of using statistical science to support the advancement of human rights. It started 25 years ago, in December 1991, in San Salvador, when Patrick Ball was invited to work with the Salvadoran Lutheran Church to design a database to keep track of human rights abuses committed by the military in El Salvador. That work soon migrated to the NGO Human Rights Commission (CDHES). Fueled by thin beer and pupusas, Patrick dove into the deep world of data from human rights testimonies, ...

New death toll estimated in Syrian civil war

Kevin Uhrmacher of the Washington Post prepared a graph that illustrates reported deaths over time, by number of organizations reporting the deaths. Washington Post, Kevin Uhrmacher, August 22, 2014. Related blog post: Updated Casualty Count for Syria.

100 Women in AI Ethics

We live in very challenging times. The pervasiveness of bias in AI algorithms and the autonomous “killer” robots looming on the horizon both necessitate open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, meaningful progress toward a world with safe, beneficial AI that helps rather than hurts the future of humanity.

53. Kristian Lum @kldivergence


The Allegheny Family Screening Tool’s Overestimation of Utility and Risk

Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Allegheny Family Screening Tool’s Overestimation of Utility and Risk. Logic(s). 13 December, 2023. Issue 20.



Big data may be reinforcing racial bias in the criminal justice system

Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.



The killings of social movement leaders and human rights defenders in Colombia 2018 – 2023: an estimate of the universe.

Valentina Rozo Ángel and Patrick Ball. 2024. The killings of social movement leaders and human rights defenders in Colombia 2018–2023: an estimate of the universe. Human Rights Data Analysis Group. 18 December 2024. © HRDAG 2024. Creative Commons International license 4.0.



Predictive policing violates more than it protects

William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.



Welcoming Our 2021-2022 Human Rights and Data Science Intern

Larry Barrett has joined HRDAG as a Human Rights and Data Science Intern through February 2022.

HRDAG Wins the Rafto Prize

The Rafto Foundation, an international human rights organization, has awarded the 2021 Rafto Prize to HRDAG for its distinguished work defending human rights and democracy.

Death rate in Habre jails higher than for Japanese POWs, trial told

Patrick Ball of the California-based Human Rights Data Analysis Group said he had calculated the mortality rate of political prisoners from 1985 to 1988 using reports completed by Habre’s feared secret police.


Estimating Deaths in Timor-Leste


The True Dangers of AI are Closer Than We Think

William Isaac is quoted.


Quantifying Injustice

“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol.  … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
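As a loose illustration of the idea, and not the authors’ actual procedure or data, the Python sketch below builds a toy “synthetic population” of incidents by combining population counts with survey-based use rates, then compares its geographic distribution with hypothetical police-recorded incidents. All figures are invented.

```python
# Toy illustration of a synthetic population: estimate where incidents
# "should" occur from demographic counts and survey-based rates, then
# compare with where incidents were actually recorded by police.
# All numbers below are invented for illustration.

population = {"district_A": 50_000, "district_B": 50_000}      # residents
survey_use_rate = {"district_A": 0.12, "district_B": 0.11}     # estimated rate from surveys
recorded_incidents = {"district_A": 180, "district_B": 820}    # police records

# Synthetic population: expected incidents implied by the survey estimates.
synthetic = {d: population[d] * survey_use_rate[d] for d in population}
total_synth = sum(synthetic.values())
total_recorded = sum(recorded_incidents.values())

for d in population:
    synth_share = synthetic[d] / total_synth
    recorded_share = recorded_incidents[d] / total_recorded
    print(f"{d}: survey-implied share {synth_share:.0%}, recorded share {recorded_share:.0%}")

# A large gap between the two shares suggests the recorded data reflect
# where enforcement happened, not only where the behavior happened.
```

The point of the comparison is that the survey-derived benchmark is independent of police activity, so it offers a yardstick against which the recorded data, and any model trained on them, can be checked.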


Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. In her work, she has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
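The amplification claim can be seen in a deliberately simplified feedback loop. The Python sketch below is a toy simulation with invented numbers, not a reproduction of Lum’s analyses: patrols follow past records, incidents in patrolled areas are recorded at a higher rate, and the initial skew in the records compounds even though the underlying rates are equal.

```python
# Toy feedback loop: two neighborhoods with identical true incident
# rates but historical records skewed toward "B". Each round, patrols
# concentrate where past records are highest, and incidents there are
# far more likely to be recorded. All numbers are invented.

true_rate = {"A": 100, "B": 100}        # actual incidents per round (equal)
recorded = {"A": 40.0, "B": 60.0}       # historical records, skewed toward B

for round_ in range(1, 6):
    target = max(recorded, key=recorded.get)          # patrols follow the records
    for n in recorded:
        catch_rate = 0.5 if n == target else 0.1      # patrolled area is over-observed
        recorded[n] += true_rate[n] * catch_rate
    total = sum(recorded.values())
    print(round_, {n: f"{recorded[n] / total:.0%}" for n in recorded})

# "B"'s share of the records climbs each round (60% -> ~78%) even though
# the true rates never differ: the skew in the data is amplified.
```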


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
