Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade-three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
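The mechanism is easy to demonstrate. Below is a minimal sketch, with all numbers invented for illustration (none come from the book), that fits a nearest-neighbor regressor on hypothetical grade-three students only; because that kind of model cannot extrapolate beyond its training data, every adult it is asked about comes back at roughly four feet tall.

```python
# Toy illustration of biased training data (invented numbers, not from the book):
# fit height-from-weight on grade-three students only, then ask about adults.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: grade-three students (weight in lbs, height in inches).
weights = rng.uniform(40, 80, size=(200, 1))
heights = 40 + 0.125 * weights[:, 0] + rng.normal(0, 1.5, size=200)

model = KNeighborsRegressor(n_neighbors=5).fit(weights, heights)

# Adult weights fall far outside the training range, so the nearest
# "neighbors" are all the heaviest kids in the class: every prediction
# lands near 50 inches -- about four feet.
adults = np.array([[130], [160], [190], [220]])
print(model.predict(adults))
```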
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that such software finds patterns in police records, not patterns in the occurrence of crime.
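Ball’s question about which error the machine should make can be made concrete with a toy classifier. The sketch below (invented data; not HRDAG code) sweeps a decision threshold and counts the two error types, showing that buying fewer false negatives always costs more false positives, and vice versa.

```python
# Toy illustration of the error trade-off in any classifier (invented data).
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scores: true positives tend to score higher than negatives.
labels = rng.integers(0, 2, size=1000)
scores = rng.normal(loc=labels.astype(float), scale=1.0)

for threshold in (-0.5, 0.5, 1.5):
    predicted = scores > threshold
    false_pos = int(np.sum(predicted & (labels == 0)))
    false_neg = int(np.sum(~predicted & (labels == 1)))
    print(f"threshold {threshold:+.1f}: {false_pos} false positives, "
          f"{false_neg} false negatives")

# A low threshold floods the output with false positives; a high one
# misses real cases. Choosing the threshold is choosing which error to make.
```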
A Human Rights Statistician Finds Truth In Numbers
The tension started in the witness room. “You could feel the stress rolling off the walls in there,” Patrick Ball remembers. “I can remember realizing that this is why lawyers wear sport coats – you can’t see all the sweat on their arms and back.” He was, you could say, a little nervous to be cross-examined by Slobodan Milosevic.
How statistics lifts the fog of war in Syria
Megan Price, director of research, is quoted from her Strata talk on how to handle multiple data sources in conflicts such as the one in Syria. From the blog post:
“The true number of casualties in conflicts like the Syrian war seems unknowable, but the mission of the Human Rights Data Analysis Group (HRDAG) is to make sense of such information, clouded as it is by the fog of war. They do this not by nominating one source of information as the ‘best’, but instead with statistical modeling of the differences between sources.”
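The core of that modeling idea — estimating what no single list saw from how the lists overlap — is captured by the classic two-list capture-recapture estimator. HRDAG’s actual work uses multiple systems estimation across several sources; the sketch below, with invented counts, shows only the textbook special case.

```python
# Two-list Lincoln-Petersen estimator: the textbook special case of the
# multi-source overlap modeling described above (invented counts).

def lincoln_petersen(n1: int, n2: int, overlap: int) -> float:
    """Estimate a total population from two incomplete lists.

    n1, n2  -- victims documented by each source
    overlap -- victims matched across both sources
    """
    if overlap == 0:
        raise ValueError("estimator is undefined with no overlap")
    return n1 * n2 / overlap

# Hypothetical: two groups document 4,000 and 3,000 deaths,
# and record linkage finds 1,200 deaths on both lists.
print(lincoln_petersen(4000, 3000, 1200))  # 10000.0 estimated total
```

The fewer victims the two lists share, the more deaths the estimator infers were never documented by anyone.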
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
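The feedback loop Lum describes can be reproduced in a few lines. The simulation below is a deliberately simplified caricature with invented numbers, not Lum and Isaac’s published analysis: two neighborhoods have identical true crime rates, patrols go wherever past records are highest, and only patrolled crime gets recorded — so a small initial gap in the records grows without bound.

```python
# Toy simulation of a predictive-policing feedback loop (invented numbers;
# a caricature of the mechanism, not Lum and Isaac's published analysis).
import numpy as np

rng = np.random.default_rng(2)

true_crime = np.array([0.5, 0.5])  # two neighborhoods, identical true rates
reports = np.array([12.0, 10.0])   # a slightly uneven historical record

for day in range(200):
    hotspot = int(np.argmax(reports))      # patrol the predicted "hotspot"
    occurred = rng.poisson(true_crime)     # crimes that actually happen today
    reports[hotspot] += occurred[hotspot]  # only patrolled crime is recorded

print(reports)  # the 12-vs-10 gap has become runaway concentration
```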
Death March
A mapped representation of the scale and spread of killings in Syria. HRDAG’s director of research, Megan Price, is quoted.