Casanare, Colombia
Justice Unknown, Justice Unsatisfied? Bosnian NGOs Speak about the International Criminal Tribunal for the Former Yugoslavia.
Kristen Cibelli and Tamy Guberek. “Justice Unknown, Justice Unsatisfied? Bosnian NGOs Speak about the International Criminal Tribunal for the Former Yugoslavia.” A project of the Education for Public Inquiry and International Citizenship (EPIIC) program at Tufts University. December 2000.
HRDAG at Strata Conference 2014
Big data may be reinforcing racial bias in the criminal justice system
Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.
Layers of Bias: A Unified Approach for Understanding Problems With Risk Assessment
Laurel Eckhouse, Kristian Lum, Cynthia Conti-Cook and Julie Ciccolini (2018). Layers of Bias: A Unified Approach for Understanding Problems With Risk Assessment. Criminal Justice and Behavior. November 23, 2018. © 2018 Sage Journals. All rights reserved. https://doi.org/10.1177/0093854818811379
Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project
Patrick Ball. Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project. © 1996 American Association for the Advancement of Science.
Using Data and Statistics to Bring Down Dictators
In this story, Guerrini discusses the impact of HRDAG’s work in Guatemala, especially the trials of General José Efraín Ríos Montt and Colonel Héctor Bol de la Cruz, as well as work in El Salvador, Syria, Kosovo, and Timor-Leste. Multiple systems estimation and the perils of using raw data to draw conclusions are also addressed.
Megan Price and Patrick Ball are quoted, especially in regard to how to use raw data.
“From our perspective,” Price says, “the solution to that is both to stay very close to the data, to be very conservative in your interpretation of it and to be very clear about where the data came from, how it was collected, what its limitations might be, and to a certain extent to be skeptical about it, to ask yourself questions like, ‘What is missing from this data?’ and ‘How might that missing information change these conclusions that I’m trying to draw?’”
Ouster of Guatemala’s Attorney General
How Structuring Data Unburies Critical Louisiana Police Misconduct Data
Data Science Symposium at Vanderbilt
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade-three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
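The excerpt’s point can be sketched in a few lines of code. This is a hypothetical illustration (the numbers are invented, not from the article): a model trained only on third-graders will hand back a child’s height for any input, including an adult’s weight. A 1-nearest-neighbour predictor makes the failure easy to see.

```python
# Hypothetical biased training set: third-graders only.
# Each pair is (weight in kg, height in cm).
train = [(22.0, 120.0), (25.0, 125.0), (28.0, 130.0), (30.0, 133.0), (33.0, 138.0)]

def predict_height(weight):
    """1-nearest-neighbour prediction: return the height of the
    training example whose weight is closest to the query."""
    _, height = min(train, key=lambda pair: abs(pair[0] - weight))
    return height

# An 80 kg adult gets the heaviest third-grader's height: 138 cm,
# roughly four and a half feet -- the model's world contains only children.
print(predict_height(80.0))  # -> 138.0
```

The algorithm itself is working exactly as designed; the erroneous output comes entirely from the unrepresentative training data, which is the excerpt’s point.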
“El reto de la estadística es encontrar lo escondido”: experto en manejo de datos sobre el conflicto (“The challenge of statistics is to find what is hidden”: a data-handling expert on the conflict)
In this interview with Colombian newspaper El Espectador, Patrick Ball is quoted as saying “la gente que no conoce de álgebra nunca debería hacer estadísticas” (people who don’t know algebra should never do statistics).
Gaza: Why is it so hard to establish the death toll?
HRDAG director of research Patrick Ball is quoted in this Nature article about how body counts are a crude measure of the war’s impact and more reliable estimates will take time to compile.
PredPol amplifies racially biased policing
HRDAG associate William Isaac is quoted in this article about how predictive policing algorithms such as PredPol exacerbate the problem of racial bias in policing.