Our Thoughts on #metoo
Clustering and Solving the Right Problem
Our Thoughts on the Violence in Charlottesville
Reflections: Some Stories Shape You
Where Stats and Rights Thrive Together
New death toll estimated in Syrian civil war
How Data Processing Uncovers Misconduct in Use of Force in Puerto Rico
Media Contact
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
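The feedback loop Lum describes can be illustrated with a toy simulation (this is not PredPol's actual algorithm, and all numbers here are invented): two neighborhoods with identical underlying crime rates, where patrols are allocated in proportion to recorded incidents and crime is only recorded where patrols are present. An initial gap in the records never corrects itself.

```python
# Toy sketch of the predictive-policing feedback loop described above.
# NOT PredPol's actual model; rates, counts, and patrol sizes are invented.

true_rate = [0.5, 0.5]       # identical underlying crime rates
recorded = [10.0, 20.0]      # neighborhood 1 starts over-represented in the data

for day in range(100):
    # allocate 10 patrols in proportion to recorded counts
    total = sum(recorded)
    patrols = [10 * c / total for c in recorded]
    # crime is recorded only where patrols are, proportional to presence
    for i in range(2):
        recorded[i] += true_rate[i] * patrols[i]

share = recorded[1] / sum(recorded)
print(f"share of patrols sent to neighborhood 1: {share:.2f}")  # stays at 0.67
```

Because each day's new records are proportional to the existing counts, the initial 1:2 imbalance is reproduced forever even though the true rates are equal, which is the "learning from previous crime reports" problem in miniature.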
¿Quién le hizo qué a quién? Planear e implementar un proyecto a gran escala de información en derechos humanos. (“Who Did What to Whom? Planning and Implementing a Large-Scale Human Rights Information Project.”)
Patrick Ball (2008). “¿Quién le hizo qué a quién? Planear e implementar un proyecto a gran escala de información en derechos humanos.” (originally in English at AAAS) Translated by Beatriz Verjerano. Palo Alto, California: Benetech.
Truth Commissioner
From the Guatemalan military to the South African apartheid police, code cruncher Patrick Ball singles out the perpetrators of political violence.
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
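As a rough illustration of how county-level predictors like these might be combined into a single risk score, here is a hand-weighted logistic sketch. The feature names mirror the variables mentioned in the quote, but the weights and the example values are entirely invented; the article describes the actual model only at the level of its inputs.

```python
import math
from dataclasses import dataclass

# Illustrative only: features follow the quote above, but the weights
# are made up and this is not the model the article describes.

@dataclass
class County:
    drug_lab_busted: bool
    borders_us: bool
    borders_ocean: bool
    pct_mountainous: float   # fraction of county area, 0.0 - 1.0
    has_highway: bool
    school_score: float      # average test score, 0 - 100

def risk_score(c: County) -> float:
    """Combine county predictors into a 0-1 score via a logistic
    function with arbitrary hand-picked weights."""
    z = (1.5 * c.drug_lab_busted
         + 1.0 * c.borders_us
         + 0.5 * c.borders_ocean
         + 0.8 * c.pct_mountainous
         + 0.4 * c.has_highway
         - 0.02 * c.school_score)
    return 1 / (1 + math.exp(-z))

example = County(True, True, False, 0.6, True, 55.0)
print(f"risk: {risk_score(example):.2f}")
```

In a real model the weights would be learned from labeled data rather than set by hand, but the structure of "many county-level covariates in, one predicted probability out" is the same.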
Death rate in Habré jails higher than for Japanese POWs, trial told
Patrick Ball of the California-based Human Rights Data Analysis Group said he had calculated the mortality rate of political prisoners from 1985 to 1988 using reports completed by Habré’s feared secret police.
HRDAG Wins the Rafto Prize
Data-driven development needs both social and computer scientists
Excerpt:
Data scientists are programmers who ignore probability but like pretty graphs, said Patrick Ball, a statistician and human rights advocate who cofounded the Human Rights Data Analysis Group.
“Data is broken,” Ball said. “Anyone who thinks they’re going to use big data to solve a problem is already on the path to fantasy land.”
The Data Scientist Helping to Create Ethical Robots
Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.
What’s the relationship between statistics and AI and machine learning?
AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.