

The killings of social leaders that are left out of the counts

An investigation by Dejusticia and the Human Rights Data Analysis Group concluded that killings of social leaders in Colombia are undercounted. In other words, the increase in these crimes in 2016 and 2017 could be even greater than what civil-society organizations and official figures report.


The World According to Artificial Intelligence (Part 2)

The World According to Artificial Intelligence – The Bias in the Machine (Part 2)

Artificial intelligence might be a technological revolution unlike any other, transforming our homes, our work, our lives; but for many – the poor, minority groups, the people deemed to be expendable – their picture remains the same.

Patrick Ball is interviewed: “The question should be, Who bears the cost when a system is wrong?”


The Untold Dead of Rodrigo Duterte’s Philippines Drug War

From the article: “Based on Ball’s calculations, using our data, nearly 3,000 people could have been killed in the three areas we analyzed in the first 18 months of the drug war. That is more than three times the official police count.”


Quantifying Injustice

“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
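The synthetic-population idea can be sketched in a toy simulation. This is an illustrative sketch under invented assumptions, not Lum and Isaac's actual model or data: two neighborhoods are given identical "true" drug-use rates (as a survey-based synthetic population would), while arrest records also depend on how heavily each neighborhood is patrolled.

```python
import random

random.seed(0)

# Toy assumptions (not Lum and Isaac's figures): both neighborhoods
# have the same true drug-use rate, but police patrol "A" twice as
# often as "B", so detection probability differs.
TRUE_USE_RATE = {"A": 0.10, "B": 0.10}
PATROL_WEIGHT = {"A": 2.0, "B": 1.0}
POPULATION = {"A": 10_000, "B": 10_000}

def synthetic_users(hood):
    """Draw a 'true' count of drug users from the survey-style rate."""
    return sum(random.random() < TRUE_USE_RATE[hood]
               for _ in range(POPULATION[hood]))

def observed_arrests(true_users, hood, base_detection=0.05):
    """Arrest counts reflect both true use AND patrol intensity."""
    p = base_detection * PATROL_WEIGHT[hood]
    return sum(random.random() < p for _ in range(true_users))

for hood in ("A", "B"):
    users = synthetic_users(hood)
    arrests = observed_arrests(users, hood)
    print(f"{hood}: true users ~{users}, recorded arrests ~{arrests}")
```

Although true use is identical by construction, the arrest data make neighborhood A look roughly twice as crime-ridden; a model trained on those arrests would direct still more patrols there. Comparing the synthetic "truth" against the recorded data is what exposes the bias.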


Data-Driven Efforts to Address Racial Inequality

From the article: “As we seek to advance the responsible use of data for racial injustice, we encourage individuals and organizations to support and build upon efforts already underway.” HRDAG is listed in the Data Driven Activism and Advocacy category.


AI for Human Rights

From the article: “Price described the touchstone of her organization as being a tension between how truth is simultaneously discovered and obscured. HRDAG is at the intersection of this tension; they are consistently participating in science’s progressive uncovering of what is true, but they are accustomed to working in spaces where this truth is denied. Of the many responsibilities HRDAG holds in its work is that of ‘speaking truth to power,’ said Price, ‘and if that’s what you’re doing, you have to know that your truth stands up to adversarial environments.’”


The mission of counting the dead


What we’ll need to find the true COVID-19 death toll

From the article: “Intentionally inconsistent tracking can also influence the final tally, notes Megan Price, a statistician at the Human Rights Data Analysis Group. During the Iraq War, for example, officials worked to conceal mortality or to cherry pick existing data to steer the political narrative. While wars are handled differently from pandemics, Price thinks the COVID-19 data could still be at risk of this kind of manipulation.”


PRIO Director Henrik Urdal’s 2022 Nobel Peace Prize Shortlist

Henrik Urdal has released his final Nobel Shortlist for 2022, and HRDAG is included on it, alongside Sviatlana Tsikhanouskaya and Alexei Navalny, and others. The list highlights pro-democracy efforts, multilateral cooperation, combating religious extremism and intolerance, and the value that research and knowledge can have for promoting peace.


Unbiased algorithms can still be problematic

“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with: is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”

HRDAG Director of Research Patrick Ball, agreeing with Lum, argued that it is perhaps more useful to frame the problem not as bias at the individual level but as bias at the institutional or structural level. If a police department, for example, is convinced it needs to police one neighborhood more heavily than another, whether any individual officer is racist matters less, he said.


The situation of social leaders “is more serious than what is being shown”

Video available. The organization Dejusticia, in partnership with a U.S. institution, asserts that the crimes are on the rise and that there is underreporting. “Lethal violence against social leaders increased by at least 10% in 2016 and 2017,” says Valentina Rozo, a researcher at Dejusticia.


Documenting Syrian Deaths with Data Science

Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”


Calculations for the Greater Good

Rollins School of Public Health. As executive director of the Human Rights Data Analysis Group, Megan Price uses statistics to shine a light on human rights abuses.


What happens when you look at crime by the numbers

Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


Machine learning is being used to uncover the mass graves of Mexico’s missing

“Patrick Ball, HRDAG’s Director of Research and the statistician behind the code, explained that the Random Forest classifier was able to predict with 100% accuracy which counties would go on to have mass graves found in them in 2014, by using the model against data from 2013. The model also identified counties where no hidden mass graves had been found, but that show a high likelihood of containing them. This predictive aspect of the model is the part that holds the most potential for future research.”
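That train-on-one-year, test-on-the-next setup can be sketched with scikit-learn. Everything below is a placeholder: the features, the label rule, and the data are invented for illustration, since the actual county-level variables and records belong to the HRDAG/Ibero/Data Cívica project and are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder county-level features standing in for the geographic and
# sociodemographic variables the article describes.
n_counties = 200
X_2013 = rng.normal(size=(n_counties, 5))
# Invented labeling rule so the toy data are learnable at all:
# "grave found" when two features jointly exceed a threshold.
y_2013 = (X_2013[:, 0] + X_2013[:, 1] > 1.0).astype(int)

X_2014 = rng.normal(size=(n_counties, 5))
y_2014 = (X_2014[:, 0] + X_2014[:, 1] > 1.0).astype(int)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_2013, y_2013)               # train on one year's counties...
accuracy = clf.score(X_2014, y_2014)  # ...evaluate on the next year's
print(f"out-of-year accuracy: {accuracy:.2f}")
```

The design point worth noting is the temporal split: the model is validated on a later year than it was trained on, which is what lets the team claim the model predicts where graves *will* be found rather than merely fitting where they already were.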


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


Clandestine graves in Mexico reveal the existence of crimes against humanity

Patrick Ball, an American statistician, is collaborating with the Human Rights Program of the Universidad Iberoamericana on an investigation into clandestine graves.


On clandestine graves, we have more information than the government: Ibero

The model “can distinguish between the municipalities where we are going to find clandestine graves and those where it is unlikely that we will find them,” explained Patrick Ball, a U.S. statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.


A system to predict clandestine graves in Mexico

To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are conducting a statistical analysis built on a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
