

Locating Hidden Graves in Mexico

For more than 10 years, and with regularity, Mexican authorities have been discovering mass graves, known as fosas clandestinas, in which hundreds of bodies and piles of bones have been found. The casualties are attributed broadly to the country’s “drug war,” although the motivations and perpetrators behind the mass murders are often unknown. Recently, HRDAG collaborated with two partners in Mexico—Data Cívica and Programa de Derechos Humanos of the Universidad Iberoamericana—to model the probability of identifying a hidden grave in each county (municipio). The model uses a set of independent variables and data about graves from 2013 ...
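A per-municipio probability model of this kind is often fit as a logistic regression. The sketch below is purely illustrative: the covariates, coefficients, and municipio labels are invented for the example and are not the variables used in the actual HRDAG/Data Cívica model.

```python
import math

# Hypothetical illustration of scoring counties (municipios) by the
# probability of discovering a hidden grave. Every number here is an
# assumption made up for this sketch, not real model output.

def grave_probability(features, coefs, intercept):
    """Logistic model: P(grave) = sigmoid(intercept + coefs . features)."""
    z = intercept + sum(c * x for c, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

# Invented covariates: (homicide rate, prior graves nearby, highway access)
municipios = {
    "Municipio A": (0.8, 1.0, 1.0),
    "Municipio B": (0.1, 0.0, 0.0),
}
coefs = (2.0, 1.5, 0.5)  # hypothetical fitted coefficients

for name, x in municipios.items():
    p = grave_probability(x, coefs, intercept=-2.0)
    print(f"{name}: {p:.2f}")
```

In practice such a model would be fit on historical grave-discovery data and used to rank municipios for investigation, with the usual caveats about reporting bias in where graves have already been found.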

Stephen Fienberg 1942-2016

We are saddened by the passing of Steve Fienberg yesterday in Pittsburgh, at the age of 74. He is perhaps best known around the world for bringing statistics to science and public policy and was a beloved professor at Carnegie Mellon University. At HRDAG we are in awe of and grateful for the work Steve did formalizing multiple systems estimation. His work on that front blazed a trail and essentially enabled all of our most important analytical work at the intersection of human rights and statistical science. If we are to reduce the amount of human violence in the world, the first task is to determine the scope of the violence, to know how much of ...
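Multiple systems estimation, which Fienberg helped formalize, uses the overlap between independent lists of documented victims to estimate how many victims were never recorded. The simplest two-list case is the Lincoln-Petersen estimator, sketched below with invented counts (real analyses use more lists and model dependence between them).

```python
# Two-list capture-recapture: the simplest case of multiple systems
# estimation. The counts below are hypothetical, not real casualty data.

def lincoln_petersen(n1, n2, m):
    """Estimate total population size from two overlapping lists.

    n1, n2: number of victims documented by each source
    m:      number of victims appearing on both lists
    """
    if m == 0:
        raise ValueError("lists must overlap for the estimator to be defined")
    return (n1 * n2) / m

# e.g. two casualty lists with 300 and 200 records, sharing 60 victims
estimate = lincoln_petersen(300, 200, 60)
print(round(estimate))  # -> 1000, implying many undocumented victims
```

The intuition: if the lists are independent samples, the fraction of list 1 that reappears in list 2 (60/300) estimates list 2's coverage of the whole population, so the total is 200 / (60/300) = 1000.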

Welcoming Our 2019-2020 Visiting Data Science Student

Bing Wang has joined HRDAG as a Visiting Data Science Student until the summer of 2020.

Our Thoughts on the Violence in Charlottesville

This week, we join our friends and colleagues in feeling horrified by the violence in Charlottesville, Virginia. As we have for the past 26 years, we stand with the victims of violence and support human rights and dignity for all. We spend our careers observing and documenting mass political violence across the world. The demands by the so-called “alt-right” to normalize racism and social exclusion are all too familiar to us. At HRDAG, our work is always guided by the Universal Declaration of Human Rights (UDHR). We reaffirm our commitment to these principles, in particular that the “recognition of the inherent dignity and of the equal and ...

Reflections: A Meaningful Partnership between HRDAG and Benetech

I joined the Benetech Human Rights Program at essentially the same time that HRDAG did, coming to Benetech from years of analyzing data for large companies in the transportation, hospitality and retail industries. But the data that HRDAG dealt with was not like the data I was familiar with, and I was fascinated to learn about how they used the data to determine "who did what to whom." Although some of the methodologies were similar to what I had experience with in the for-profit sector, the goals and beneficiaries of the analyses were very different. At Benetech, I was initially predominantly focused on product management for Martus, a free ...

HRDAG’s Year End Review: 2019

In 2019, HRDAG aimed to count those who haven't been counted.

New Publication in Biometrika

New paper in Biometrika, co-authored by HRDAG's Kristian Lum and James Johndrow: Theoretical limits of microclustering in record linkage.

Skoll World Forum 2018

Illuminating Data's Dark Side: Big data create conveniences, but we must consider who designs these tools, who benefits from them, and who is left out of the equation.

Kristian Lum in Bloomberg

The interview poses questions about Lum's focus on artificial intelligence and its impact on predictive policing and sentencing programs.

HRDAG at FAT* 2020: Pre-Trial Risk Assessment Tools

How do police officer booking decisions affect pre-trial risk assessment tools relied upon by judges?

Disrupt San Francisco TechCrunch 2018

On September 7, 2018, Kristian Lum and Patrick Ball participated in a panel at Disrupt San Francisco by TechCrunch. The talk was titled "Dismantling Algorithmic Bias." Brian Brackeen of Kairos was part of the panel as well, and the talk was moderated by TechCrunch reporter Megan Rose Dickey. From the TechCrunch website, "Disrupt is a 3-day conference focused on breaking technology news and developments with big-name thought leaders who are making waves in the industry." Video of the talk is available here, and Megan Rose Dickey's coverage is here.

Welcoming Our 2018 Data Science Fellow

Shemika Lamare has joined the HRDAG team as our new data science fellow.

Counting the Dead in Sri Lanka

ITJP and HRDAG are urging groups inside and outside Sri Lanka to share existing casualty lists.

HRDAG Names New Board Member Margot Gerritsen

Margot is a professor in the Department of Energy Resources Engineering at Stanford University, interested in computer simulation and mathematical analysis of engineering processes.

Reflections: It Began In Bogotá

It was July of 2006; I’d spent five years working at a local human rights NGO in Bogotá, and I had reached retirement age. But then a whole new world opened up for me to discover. Tamy Guberek, then HRDAG Latin America coordinator, whom I had met at the NGO, approached me about becoming part of the HRDAG Colombia team as a research/administrative assistant. Over a cup of suitably Colombian coffee, the deal was quickly “signed.” My responsibilities ranged from fundraising to translations, from support in data gathering for estimates on homicides and disappearances in various regions of Colombia to editorial support for different Benetech-HRDAG ...

Estimating the Number of SARS-CoV-2 Infections and the Impact of Mitigation Policies

This Harvard Data Science Review article uses the least unreliable source of pandemic data: reported deaths.

100 Women in AI Ethics

We live in very challenging times. The pervasiveness of bias in AI algorithms and the autonomous “killer” robots looming on the horizon necessitate an open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, more meaningful progress toward a world with safe, beneficial AI that will help, not hurt, the future of humanity.

53. Kristian Lum @kldivergence


How Machine Learning Protects Whistle-Blowers in Staten Island

People filed complaints against NYPD officers, and HRDAG went above and beyond to protect the privacy of the people who reported the offenses.

Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse

Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”


La misión de contar muertos (The Mission of Counting the Dead)


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.

Donate