

Documenting Syrian Deaths with Data Science

Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”


What happens when you look at crime by the numbers

Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


Clandestine graves in Mexico point to crimes against humanity

Patrick Ball, an American statistician, is collaborating with the Human Rights Program of the Universidad Iberoamericana on an investigation into clandestine graves.


System created to predict clandestine graves in Mexico

To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are conducting a statistical analysis built on a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.


The World According to Artificial Intelligence: Targeted by Algorithm (Part 1)

The Big Picture: The World According to AI explores how artificial intelligence is being used today, and what it means to those on its receiving end.

Patrick Ball is interviewed: “Machine learning is pretty good at finding elements out of a huge pool of non-elements… But we’ll get a lot of false positives along the way.”


The number of murdered social leaders is higher: Dejusticia

Contrary to what one might think, official figures on murdered social leaders do not necessarily reflect reality, and victimization in the regions hit by this scourge could be much higher, according to the most recent report by the Center for the Study of Law, Justice and Society (Dejusticia) in collaboration with the Human Rights Data Analysis Group.


The situation of social leaders "is more serious than what is being shown"

Video available. The organization Dejusticia, in partnership with a U.S.-based institution, asserts that the crimes are on the rise and that there is underreporting. "Lethal violence against social leaders increased by at least 10% in 2016 and 2017," says Valentina Rozo, a Dejusticia researcher.


500 of those who surrendered in the final three days have been disappeared


All the Dead We Cannot See

Ball, a statistician, has spent the last two decades finding ways to make the silence speak. He helped pioneer the use of formal statistical modeling, and, later, machine learning—tools more often used for e-commerce or digital marketing—to measure human rights violations that weren’t recorded. In Guatemala, his analysis helped convict former dictator General Efraín Ríos Montt of genocide in 2013. It was the first time a former head of state was found guilty of the crime in his own country.


Justice by the Numbers

Wilkerson was speaking at the inaugural Conference on Fairness, Accountability, and Transparency, a gathering of academics and policymakers working to make the algorithms that govern growing swaths of our lives more just. The woman who’d invited him there was Kristian Lum, the 34-year-old lead statistician at the Human Rights Data Analysis Group, a San Francisco-based non-profit that has spent more than two decades applying advanced statistical models to expose human rights violations around the world. For the past three years, Lum has deployed those methods to tackle an issue closer to home: the growing use of machine learning tools in America’s criminal justice system.


The scientist who uses statistics to find the disappeared in El Salvador, Guatemala, and Mexico

Patrick Ball is a bloodhound for the truth. That drive to uncover what others want to hide has led him to develop mathematical formulas to detect the disappeared.

His work consists of applying scientific measurement methods to document mass human rights violations.


Why Collecting Data In Conflict Zones Is Invaluable—And Nearly Impossible


Syrian civil war death toll exceeds 190,000, U.N. reports

Ayan Sheikh of PBS NewsHour reports on the UN Office of the High Commissioner for Human Rights' release of HRDAG's third report on reported killings in the Syrian conflict.
From the article:
The latest death toll figure, which covers the period from March 2011 to April of this year, came from the Human Rights Data Analysis Group and is the third study of its kind on Syria. The analysis group identified 191,269 deaths. Data was collected from five different sources to exclude inaccuracies and repetitions.


The Ways AI Decides How Low-Income People Work, Live, Learn, and Survive

HRDAG is mentioned in the “child welfare (sometimes called “family policing”)” section: At least 72,000 low-income children are exposed to AI-related decision-making through government child welfare agencies’ use of AI to determine if they are likely to be neglected. As a result, these children experience heightened risk of being separated from their parents and placed in foster care.


How Do We Know the Death Toll in Syria Is Accurate?


To Combat Human Rights Abuses, California Company Looks to Computer Code


The Invisible Crime (PDF of English translation)


Martus – Paramilitary Protection for Activists


Martus: Software for Human Rights Groups


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
