Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
Big Data Predictive Analytics Comes to Academic and Nonprofit Institutions to Fuel Innovation
Director of Research Megan Price is quoted: “Revolution Analytics will allow HRDAG to handle bigger data sets and leverage the power of R to accomplish this goal and uncover the truth.”
Crean sistema para predecir fosas clandestinas en México
To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are carrying out a statistical analysis built around a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data. (The article is in Spanish.)
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
Documenting Syrian Deaths with Data Science
Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”
The World According to Artificial Intelligence – The Bias in the Machine (Part 2)
Artificial intelligence might be a technological revolution unlike any other, transforming our homes, our work, our lives; but for many – the poor, minority groups, the people deemed to be expendable – their picture remains the same.
Patrick Ball is interviewed: “The question should be, Who bears the cost when a system is wrong?”
Direct procès Habré: le taux de mortalité dans les centres de détention, au menu des débats
Statistician Patrick Ball took the stand on Friday morning. The expert testified about the mortality rate in detention centers in Chad under Habré. Appointed by the indictment chamber, he said he based his work on testimonies, data provided by victims, and documents from the DDS (Direction de la Documentation et de la Sécurité). (The article is in French.)
Using Data to Reveal Human Rights Abuses
Profile touching on HRDAG’s work on the trial and conviction of Hissène Habré, its US Policing Project, data integrity, data archaeology and more.
Kriege und Social Media: Die Daten sind nicht perfekt
Süddeutsche Zeitung writer Mirjam Hauck interviewed HRDAG affiliate Anita Gohdes about the pitfalls of relying on social media data when interpreting violence in the context of war. This article, “Kriege und Social Media: Die Daten sind nicht perfekt” (“Wars and Social Media: The Data Are Not Perfect”), is in German.
The Untold Dead of Rodrigo Duterte’s Philippines Drug War
From the article: “Based on Ball’s calculations, using our data, nearly 3,000 people could have been killed in the three areas we analyzed in the first 18 months of the drug war. That is more than three times the official police count.”
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
Using Data and Statistics to Bring Down Dictators
In this story, Guerrini discusses the impact of HRDAG’s work in Guatemala, especially the trials of General José Efraín Ríos Montt and Colonel Héctor Bol de la Cruz, as well as work in El Salvador, Syria, Kosovo, and Timor-Leste. Multiple systems estimation and the perils of using raw data to draw conclusions are also addressed.
Megan Price and Patrick Ball are quoted, especially in regard to how to use raw data.
“From our perspective,” Price says, “the solution to that is both to stay very close to the data, to be very conservative in your interpretation of it and to be very clear about where the data came from, how it was collected, what its limitations might be, and to a certain extent to be skeptical about it, to ask yourself questions like, ‘What is missing from this data?’ and ‘How might that missing information change these conclusions that I’m trying to draw?’”
A Data Double Take: Police Shootings
“In a recent article, social scientist Patrick Ball revisited his and Kristian Lum’s 2015 study, which made a compelling argument for the underreporting of lethal police shootings by the Bureau of Justice Statistics (BJS). Lum and Ball’s study may be old, but it bears revisiting amid debates over the American police system — debates that have featured plenty of data on the excessive use of police force. It is a useful reminder that many of the facts and figures we rely on require further verification.”
Police transparency expands with new national database — except Michigan
Tarak Shah is quoted with regard to the National Police Index: “Police often avoid accountability by moving to another agency rather than face discipline. This tool, allowing anyone to look up and track the histories of such officers, provides an invaluable service for the human rights community in our fight against impunity.”
Justice by the Numbers
Wilkerson was speaking at the inaugural Conference on Fairness, Accountability, and Transparency, a gathering of academics and policymakers working to make the algorithms that govern growing swaths of our lives more just. The woman who’d invited him there was Kristian Lum, the 34-year-old lead statistician at the Human Rights Data Analysis Group, a San Francisco-based non-profit that has spent more than two decades applying advanced statistical models to expose human rights violations around the world. For the past three years, Lum has deployed those methods to tackle an issue closer to home: the growing use of machine learning tools in America’s criminal justice system.
