The World According to Artificial Intelligence – The Bias in the Machine (Part 2)
Artificial intelligence might be a technological revolution unlike any other, transforming our homes, our work, our lives; but for many – the poor, minority groups, the people deemed to be expendable – their picture remains the same.
Patrick Ball is interviewed: “The question should be, Who bears the cost when a system is wrong?”
Data on Kosovo – Other
Media Contact
How Data Extraction Illuminates Racial Disparities in Boston SWAT Raids
Can the Armed Conflict Become Part of Colombia’s History?
How Data Analysis Confirmed the Bias in a Family Screening Tool
.Rproj Considered Harmful
Documents of war: Understanding the Syrian Conflict
Megan Price, Anita Gohdes, and Patrick Ball. 2015. Significance 12, no. 2 (April): 14–19. doi: 10.1111/j.1740-9713.2015.00811.x. © 2015 The Royal Statistical Society. All rights reserved. [online abstract]
Analyzing patterns of violence in Colombia using more than 100 databases
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
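The feedback loop Lum describes can be illustrated with a toy simulation (this is not PredPol's algorithm; the neighborhood names, rates, and counts below are invented for illustration). Two neighborhoods have the same true crime rate, but one starts with more historical reports; patrols are allocated in proportion to past reports, and recorded incidents scale with patrol presence, so the initial bias never washes out:

```python
import random

random.seed(0)

# Two neighborhoods with the SAME true incident rate, but "A" starts
# with more historical reports (hypothetical numbers).
true_rate = 0.1
reports = {"A": 60, "B": 40}
patrols_per_day = 10

for day in range(50):
    total = sum(reports.values())
    for hood in reports:
        # Feedback step: allocate patrols in proportion to past reports.
        patrols = round(patrols_per_day * reports[hood] / total)
        # More patrols -> more recorded incidents, at equal true rates.
        recorded = sum(random.random() < true_rate for _ in range(patrols))
        reports[hood] += recorded

share_A = reports["A"] / sum(reports.values())
print(f"Share of all recorded reports attributed to A: {share_A:.2f}")
```

Even though both neighborhoods generate incidents at the same underlying rate, the record-driven allocation preserves and entrenches the initial disparity: neighborhood A keeps receiving more patrols and therefore keeps accumulating more reports, which is the self-confirming loop the research warned about.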
How we go about estimating casualties in Syria—Part 1
Donate with Cryptocurrency
Syria’s celebrations muted by evidence of torture in Assad’s notorious prisons
The Human Rights Data Analysis Group, an independent scientific human rights organization based in San Francisco, has counted at least 17,723 people killed in Syrian custody from 2011 to 2015 — around 300 every week — almost certainly a vast undercount, it says.
Quantitative Research at the AHPN Guatemala
Remembering Scott Weikart
An Award for Anita Gohdes
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
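A county-level model of this kind can be sketched in miniature (this is not HRDAG's actual model or data; the feature names are taken from the quote above, the labels are synthetic, and the logistic-regression fit is just one plausible choice of method):

```python
import math
import random

random.seed(1)

# County-level predictors named in the quote (values here are synthetic).
FEATURES = ["drug_lab_busted", "borders_us", "borders_ocean",
            "pct_mountainous", "has_highway", "school_score"]

def synth_county(hidden_w):
    # Fabricated county: random features plus a label drawn from a
    # hidden rule, standing in for "grave site found in this county".
    x = [random.random() for _ in FEATURES]
    logit = sum(w * xi for w, xi in zip(hidden_w, x)) - 1.5
    y = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return x, y

hidden_w = [1.0, 2.0, 1.5, 0.5, 0.8, -1.0]
data = [synth_county(hidden_w) for _ in range(500)]

# Fit logistic regression by stochastic gradient descent (stdlib only).
w = [0.0] * len(FEATURES)
b = 0.0
lr = 0.1
for epoch in range(200):
    for x, y in data:
        p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        err = p - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

def predict(x):
    # Predicted probability that a county with features x has a site.
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(f"Training accuracy: {accuracy:.2f}")
```

The point of such a model is not the particular classifier but the feature engineering: combining obvious signals (a busted drug lab, a border) with less obvious ones (terrain, highways, school outcomes) lets the model rank counties by risk for further investigation.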