Indirect Sampling to Measure Conflict Violence: Trade-offs in the Pursuit of Data That Are Good, Cheap, and Fast
Romesh Silva and Megan Price. “Indirect Sampling to Measure Conflict Violence: Trade-offs in the Pursuit of Data That Are Good, Cheap, and Fast.” Journal of the American Medical Association. 306(5):547-548. 2011. © 2011 JAMA. All rights reserved.
DATNAV: New Guide to Navigate and Integrate Digital Data in Human Rights Research
DatNav is the result of a collaboration between Amnesty International, Benetech, and The Engine Room, which began in late 2015 and culminated in an intense four-day writing sprint facilitated by Chris Michael and Collaborations for Change in May 2016. HRDAG consultant Jule Krüger is a contributor, and HRDAG director of research Patrick Ball is a reviewer.
When It Comes to Human Rights, There Are No Online Security Shortcuts
Patrick Ball. When It Comes to Human Rights, There Are No Online Security Shortcuts, Wired op-ed, August 10, 2012. Wired.com © 2013 Condé Nast. All rights reserved.
Trump’s “extreme-vetting” software will discriminate against immigrants “Under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
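The false-positive concern Lum raises is the classic base-rate problem: when the outcome being screened for is extremely rare, even a seemingly accurate tool flags far more innocent people than true cases. A minimal sketch of that arithmetic, with all numbers hypothetical:

```python
# Hypothetical base-rate arithmetic illustrating the quoted concern:
# screening a large population for a very rare outcome produces mostly
# false positives, even with an accurate-looking model.

population = 10_000_000     # people screened (hypothetical)
prevalence = 1e-5           # fraction who are actual future offenders
sensitivity = 0.9           # true-positive rate of the hypothetical tool
false_positive_rate = 0.01  # fraction of harmless people wrongly flagged

actual = population * prevalence                               # 100 true cases
true_positives = actual * sensitivity                          # 90 caught
false_positives = (population - actual) * false_positive_rate  # 99,999 wrongly flagged

precision = true_positives / (true_positives + false_positives)
print(f"{precision:.4f}")  # roughly 0.0009
```

Under these assumed rates, more than 99.9% of the people flagged would never have committed any offense, which is exactly the harm the letter signatories describe.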
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
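Ball's parable can be sketched in a few lines. The numbers below are entirely hypothetical: a nearest-neighbour height-from-weight model is trained only on third-graders, so when asked about an adult it can only answer with a child's height, about four and a half feet.

```python
import numpy as np

# Illustrative sketch (all numbers hypothetical) of the training-data
# problem described above: a model trained only on third-graders has
# nothing sensible to say about adults.

rng = np.random.default_rng(0)

# Hypothetical third-grade sample: weights 25-35 kg, heights ~125-135 cm.
weights = rng.uniform(25, 35, 200)
heights = 100.0 + weights + rng.normal(0.0, 3.0, 200)

def predict_height(w):
    """1-nearest-neighbour prediction: the height of the child whose
    weight is closest to w."""
    return heights[np.abs(weights - w).argmin()]

# An 80 kg adult is far outside the training distribution, so the model
# returns a child's height -- roughly 135 cm (about 4.5 feet).
print(predict_height(80.0))
```

The prediction is wrong not because nearest-neighbour lookup is a bad algorithm, but because the sample excludes the people the model is asked about, which is precisely Ball's point.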
One Better
The University of Michigan College of Literature, Science, and the Arts profiled Patrick Ball in the fall 2016 issue of its alumni magazine. Here’s an excerpt:
Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.
“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”
A Human Rights Statistician Finds Truth In Numbers
The tension started in the witness room. “You could feel the stress rolling off the walls in there,” Patrick Ball remembers. “I can remember realizing that this is why lawyers wear sport coats – you can’t see all the sweat on their arms and back.” He was, you could say, a little nervous to be cross-examined by Slobodan Milosevic.
The Forensic Humanitarian
International human rights work attracts activists and lawyers, diplomats and retired politicians. One of the most admired figures in the field, however, is a ponytailed statistics guru from Silicon Valley named Patrick Ball, who has spent nearly two decades fashioning a career for himself at the intersection of mathematics and murder. You could call him a forensic humanitarian.
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”