

Undercover Minnesota officers suing oversight board have public LinkedIns, discipline and shootings

“In January, Invisible Institute released the data on a tool called the National Police Index, which houses data from over two dozen of POST’s peer agencies around the country. Developed by Invisible Institute, Human Rights Data Analysis Group, and Innocence & Justice Louisiana, the NPI seeks employment history data from state POST agencies to track, among other questions, the issue of so-called “wandering cops” who move from department to department after committing misconduct.” Read the article.


Tallying Syria’s War Dead

“Led by the nonprofit Human Rights Data Analysis Group (HRDAG), the process began with creating a merged dataset of “fully identified victims” to avoid double counting. Only casualties whose complete details were listed — such as their full name, date of death and the governorate they had been killed in — were included on this initial list, explained Megan Price, executive director at HRDAG. If details were missing, the victim could not be confidently cross-checked across the eight organizations’ lists, and so was excluded. This provided HRDAG and the U.N. with a minimum count of individuals whose deaths were fully documented by at least one of the different organizations. …”
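The merging step described in this excerpt amounts to deduplication on a complete identifying key: keep only fully identified records, then count each (name, date of death, governorate) combination once across all source lists. The sketch below illustrates that idea only; the field names, sample records, and exact-match rule are assumptions for illustration, not HRDAG’s actual record-linkage pipeline, which must also handle spelling variants, transliteration, and partial dates.

```python
# Illustrative sketch only: count each fully identified victim once across
# several organizations' lists. Field names, sample data, and the exact-match
# key are hypothetical, not HRDAG's actual methodology.
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    full_name: str | None
    date_of_death: str | None   # e.g. "2013-08-21"
    governorate: str | None
    source: str                 # which organization's list the record came from

def minimum_documented_count(lists: list[list[Record]]) -> int:
    """Minimum count of fully identified victims documented by at least one list."""
    seen: set[tuple[str, str, str]] = set()
    for source_list in lists:
        for rec in source_list:
            # Exclude records missing any identifying detail: they cannot be
            # confidently cross-checked against the other lists.
            if not (rec.full_name and rec.date_of_death and rec.governorate):
                continue
            # Identical (name, date, governorate) triples are treated as one
            # person, so a death reported by several groups is counted once.
            seen.add((rec.full_name.strip().lower(),
                      rec.date_of_death,
                      rec.governorate.strip().lower()))
    return len(seen)

# Hypothetical usage: the same person reported by two organizations counts once,
# and the incomplete record is set aside rather than guessed at.
list_a = [Record("Amina Khalid", "2013-08-21", "Rif Dimashq", "org_a")]
list_b = [Record("Amina Khalid", "2013-08-21", "Rif Dimashq", "org_b"),
          Record(None, "2014-01-05", "Aleppo", "org_b")]
assert minimum_documented_count([list_a, list_b]) == 1
```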


The Ways AI Decides How Low-Income People Work, Live, Learn, and Survive

HRDAG is mentioned in the “child welfare (sometimes called ‘family policing’)” section: At least 72,000 low-income children are exposed to AI-related decision-making through government child welfare agencies’ use of AI to determine whether they are likely to be neglected. As a result, these children experience heightened risk of being separated from their parents and placed in foster care.


The World According to Artificial Intelligence (Part 2)

The World According to Artificial Intelligence – The Bias in the Machine (Part 2)

Artificial intelligence might be a technological revolution unlike any other, transforming our homes, our work, our lives; but for many – the poor, minority groups, the people deemed to be expendable – their picture remains the same.

Patrick Ball is interviewed: “The question should be, Who bears the cost when a system is wrong?”


To Combat Human Rights Abuses, California Company Looks to Computer Code


Human Rights Violations of Hissène Habré


Gaza: Why is it so hard to establish the death toll?

HRDAG director of research Patrick Ball is quoted in this Nature article, explaining that body counts are a crude measure of the war’s impact and that more reliable estimates will take time to compile.


Police transparency expands with new national database — except Michigan

Tarak Shah is quoted with regard to the National Police Index: “Police often avoid accountability by moving to another agency rather than face discipline. This tool, allowing anyone to look up and track the histories of such officers, provides an invaluable service for the human rights community in our fight against impunity.”


Martus – Paramilitary Protection for Activists


The Untold Dead of Rodrigo Duterte’s Philippines Drug War

From the article: “Based on Ball’s calculations, using our data, nearly 3,000 people could have been killed in the three areas we analyzed in the first 18 months of the drug war. That is more than three times the official police count.”


Here’s how an AI tool may flag parents with disabilities

HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”


A Data Double Take: Police Shootings

“In a recent article, social scientist Patrick Ball revisited his and Kristian Lum’s 2015 study, which made a compelling argument for the underreporting of lethal police shootings by the Bureau of Justice Statistics (BJS). Lum and Ball’s study may be old, but it bears revisiting amid debates over the American police system — debates that have featured plenty of data on the excessive use of police force. It is a useful reminder that many of the facts and figures we rely on require further verification.”


Martus: Software for Human Rights Groups


Data-Driven Efforts to Address Racial Inequality

From the article: “As we seek to advance the responsible use of data to address racial injustice, we encourage individuals and organizations to support and build upon efforts already underway.” HRDAG is listed in the Data-Driven Activism and Advocacy category.


All the Dead We Cannot See

Ball, a statistician, has spent the last two decades finding ways to make the silence speak. He helped pioneer the use of formal statistical modeling, and, later, machine learning—tools more often used for e-commerce or digital marketing—to measure human rights violations that weren’t recorded. In Guatemala, his analysis helped convict former dictator General Efraín Ríos Montt of genocide in 2013. It was the first time a former head of state was found guilty of the crime in his own country.


Justice by the Numbers

Wilkerson was speaking at the inaugural Conference on Fairness, Accountability, and Transparency, a gathering of academics and policymakers working to make the algorithms that govern growing swaths of our lives more just. The woman who’d invited him there was Kristian Lum, the 34-year-old lead statistician at the Human Rights Data Analysis Group, a San Francisco-based non-profit that has spent more than two decades applying advanced statistical models to expose human rights violations around the world. For the past three years, Lum has deployed those methods to tackle an issue closer to home: the growing use of machine learning tools in America’s criminal justice system.


500 of Those Who Surrendered in the Final Three Days Have Been Disappeared


The Situation of Social Leaders “Is More Serious Than What Is Being Shown”

Video available. The organization Dejusticia, in partnership with a U.S. institution, asserts that the crimes are increasing and that there is underreporting. “Lethal violence against social leaders increased by at least 10% in 2016 and 2017,” says Valentina Rozo, a researcher at Dejusticia.


Truth Commissioner


The Number of Social Leaders Killed Is Higher: Dejusticia

Contrary to what one might think, official figures on murdered social leaders do not necessarily reflect reality, and victimization in the regions hit by this scourge could be much higher, according to the most recent report by the Center for the Study of Law, Justice and Society (Dejusticia) in collaboration with the Human Rights Data Analysis Group.


