Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could get stuck in a feedback loop of over-policing majority black and brown neighbourhoods, because the program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
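The feedback loop Lum describes can be illustrated with a toy simulation (entirely hypothetical numbers, not HRDAG's actual analysis): two districts have identical true crime rates, but patrols are allocated in proportion to recorded crime, and crime is only recorded where police patrol.

```python
# Toy sketch of a predictive-policing feedback loop (hypothetical numbers,
# not HRDAG's model). Both districts have the same true crime rate, but
# district 0 starts with more recorded crime.
true_rate = [10.0, 10.0]   # identical underlying crime per period
recorded = [30.0, 20.0]    # historical records are skewed toward district 0

for _ in range(50):
    total = sum(recorded)
    shares = [r / total for r in recorded]                 # patrols follow the records
    observed = [t * s for t, s in zip(true_rate, shares)]  # crime is seen only where police go
    recorded = [r + o for r, o in zip(recorded, observed)] # records feed the next forecast

share0 = recorded[0] / sum(recorded)
print(share0)  # stays at 0.60: the historical skew never corrects itself
```

Even though the two districts are identical, district 0 keeps receiving 60% of patrols indefinitely: the model faithfully reproduces the bias baked into its training data.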
How we go about estimating casualties in Syria—Part 1
Donate with Cryptocurrency
Syria’s celebrations muted by evidence of torture in Assad’s notorious prisons
The Human Rights Data Analysis Group, an independent scientific human rights organization based in San Francisco, has counted at least 17,723 people killed in Syrian custody from 2011 to 2015 — around 300 every week — almost certainly a vast undercount, it says.
Quantitative Research at the AHPN Guatemala
Remembering Scott Weikart
An Award for Anita Gohdes
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
Machine learning is being used to uncover the mass graves of Mexico’s missing
“Patrick Ball, HRDAG’s Director of Research and the statistician behind the code, explained that the Random Forest classifier, trained on data from 2013, was able to predict with 100% accuracy which counties would go on to have mass graves found in them in 2014. The model also identified counties where no hidden graves had yet been found but which show a high likelihood of containing them. This predictive aspect of the model holds the most potential for future research.”
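As a rough illustration of the train-on-one-year, predict-the-next pattern described above, a Random Forest workflow might look like the sketch below. The features mirror the predictors named in the article, but the data, labels, and their relationship are invented for the example; this is not HRDAG's actual code or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def county_features(n):
    # Toy stand-ins for the predictors named in the article.
    return np.column_stack([
        rng.integers(0, 2, n),  # drug lab busted in the county?
        rng.integers(0, 2, n),  # borders the United States?
        rng.integers(0, 2, n),  # borders the ocean?
        rng.random(n),          # fraction of the county that is mountainous
        rng.integers(0, 2, n),  # major highway present?
        rng.random(n),          # school achievement index
    ])

def grave_labels(X):
    # Invented relationship: graves are likelier where labs were busted
    # and along the border. Real labels would come from field reports.
    p = np.clip(0.05 + 0.4 * X[:, 0] + 0.3 * X[:, 1], 0, 1)
    return rng.random(len(X)) < p

X_2013 = county_features(300)
y_2013 = grave_labels(X_2013)
X_2014 = county_features(300)
y_2014 = grave_labels(X_2014)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_2013, y_2013)      # train on the 2013 counties
pred = clf.predict(X_2014)   # forecast the 2014 counties
print(accuracy_score(y_2014, pred))
```

On noisy synthetic labels like these the accuracy will fall well short of 100%; reaching it on the real counties, as reported, suggests an unusually strong signal in the real county features.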
HRDAG’s Year in Review: 2021
Killings of Social Movement Leaders in Colombia
Colombia
Donate
About HRDAG
Accounts and measurements of criminality and violence
Exploration and analysis of data to understand reality. Patrick Ball and Michael Reed Hurtado. 2015. Forensis 16, no. 1 (July): 529–545. © 2015 Instituto Nacional de Medicina Legal y Ciencias Forenses (República de Colombia).
“The challenge of statistics is to find what is hidden”: an expert on handling data about the conflict
In this interview with Colombian newspaper El Espectador, Patrick Ball is quoted as saying “la gente que no conoce de álgebra nunca debería hacer estadísticas” (people who don’t know algebra should never do statistics).
PredPol amplifies racially biased policing
HRDAG associate William Isaac is quoted in this article about how predictive policing algorithms such as PredPol exacerbate the problem of racial bias in policing.
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade-three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
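A minimal numeric sketch of that anecdote (hypothetical numbers): within a single third-grade class, weight barely predicts height, so a line fitted only to that class predicts roughly the class-average height for everyone, including adults far outside the training range.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training set: 400 third graders. Within the class,
# height is essentially unrelated to weight.
kid_weight = rng.normal(25, 3, 400)   # kg
kid_height = rng.normal(130, 3, 400)  # cm (~4 ft 3 in)

# Ordinary least squares: height = a * weight + b
a, b = np.polyfit(kid_weight, kid_height, 1)

adult_weight = 75.0                   # kg -- far outside the training range
predicted = a * adult_weight + b
print(predicted)  # roughly 130 cm, i.e. "about four feet tall"
```

The fit is fine on its own terms; it simply carries no information about anyone outside the class, which is exactly the author's point about unrepresentative training data.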