The Limits of Observation for Understanding Mass Violence
Limitations of mitigating judicial bias with machine learning
Kristian Lum (2017). "Limitations of mitigating judicial bias with machine learning." Nature Human Behaviour, 26 June 2017. DOI: 10.1038/s41562-017-0141. © 2017 Macmillan Publishers Limited, part of Springer Nature. All rights reserved.
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
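The feedback loop Lum describes can be illustrated with a minimal toy simulation (all numbers invented): two districts have identical true crime rates, but one starts with more recorded incidents, so a "predict from past reports" model keeps sending more patrols there, and officers record more of what they are present to observe.

```python
import random

random.seed(0)

# Two districts with the SAME underlying crime rate.
true_rate = {"A": 0.10, "B": 0.10}

# Historical reports over-represent district A (legacy over-policing).
reports = {"A": 30, "B": 10}

for step in range(20):
    # "Predictive" model: allocate 100 patrols in proportion to past reports.
    total = sum(reports.values())
    patrols = {d: 100 * reports[d] / total for d in reports}

    # Recorded crime scales with patrol presence, not with true rates:
    # officers can only record incidents they are deployed to observe.
    for d in reports:
        observed = sum(random.random() < true_rate[d]
                       for _ in range(int(patrols[d])))
        reports[d] += observed

# Despite equal true rates, district A's recorded totals stay far higher,
# and the model keeps "confirming" its own allocation.
print(reports)
```

The point of the sketch is that the disparity never self-corrects: the training data encodes where police looked, not where crime occurred.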
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
Historic verdict in Guatemala: Gen. Efraín Ríos Montt found guilty
Using Data and Statistics to Bring Down Dictators
In this story, Guerrini discusses the impact of HRDAG’s work in Guatemala, especially the trials of General José Efraín Ríos Montt and Colonel Héctor Bol de la Cruz, as well as work in El Salvador, Syria, Kosovo, and Timor-Leste. Multiple systems estimation and the perils of using raw data to draw conclusions are also addressed.
Megan Price and Patrick Ball are quoted, especially in regard to how to use raw data.
“From our perspective,” Price says, “the solution to that is both to stay very close to the data, to be very conservative in your interpretation of it and to be very clear about where the data came from, how it was collected, what its limitations might be, and to a certain extent to be skeptical about it, to ask yourself questions like, ‘What is missing from this data?’ and ‘How might that missing information change these conclusions that I’m trying to draw?’”
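The multiple systems estimation mentioned above has, as its simplest case, the two-list Lincoln-Petersen estimator: compare how many victims two independent documentation efforts share to estimate how many were recorded by neither. A minimal sketch, with invented counts:

```python
# Two-list capture-recapture (Lincoln-Petersen), the simplest form of
# multiple systems estimation. All counts below are invented for illustration.
list_a = 400   # victims documented by source A
list_b = 300   # victims documented by source B
overlap = 60   # victims appearing on BOTH lists

# Under the (strong) assumption that the two lists capture victims
# independently, the estimated total -- including undocumented deaths -- is:
estimated_total = list_a * list_b / overlap
print(estimated_total)  # 2000.0

# The estimated number of victims missing from both lists:
print(estimated_total - (list_a + list_b - overlap))  # 1360.0
```

Real HRDAG analyses use more than two lists and model dependence between them; this two-list version only shows why the overlap, not the raw counts, carries the information.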
HRDAG Welcomes Two New Scholars
New Report Raises Questions Over CPD’s Approach to Missing Persons Cases
In this video, Trina Reynolds-Tyler of Invisible Institute talks about her work with HRDAG on the missing persons project in Chicago and Beneath the Surface.
Human Rights Violations: How Do We Begin Counting the Dead?
Gaza: Why is it so hard to establish the death toll?
HRDAG director of research Patrick Ball is quoted in this Nature article about how body counts are a crude measure of the war’s impact and more reliable estimates will take time to compile.
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
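Ball's example can be made concrete with a minimal sketch (the heights below are invented): a model fit only to third-graders centers its predictions on third-grader heights, no matter whom it is asked about.

```python
# Invented training sample: heights (in inches) from one grade three class,
# after children self-conscious about their weight skipped the exercise.
class_heights = [48, 50, 47, 49, 51, 48, 50]

# The simplest "model" anchors its predictions to the training mean;
# a regression fit to this sample would extrapolate just as badly,
# because it has never seen an adult.
predicted_height = sum(class_heights) / len(class_heights)
print(predicted_height)       # 49.0 inches
print(predicted_height / 12)  # ~4.1 feet -- "most people are about four feet tall"
```

Nothing in the fitting procedure is wrong; the selection of the training data is.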
“El reto de la estadística es encontrar lo escondido” ("The challenge of statistics is to find what is hidden"): data expert on the conflict
In this interview with Colombian newspaper El Espectador, Patrick Ball is quoted as saying “la gente que no conoce de álgebra nunca debería hacer estadísticas” (people who don’t know algebra should never do statistics).
Data-driven development needs both social and computer scientists
Excerpt:
Data scientists are programmers who ignore probability but like pretty graphs, said Patrick Ball, a statistician and human rights advocate who cofounded the Human Rights Data Analysis Group.
“Data is broken,” Ball said. “Anyone who thinks they’re going to use big data to solve a problem is already on the path to fantasy land.”