A Human Rights Statistician Finds Truth In Numbers
The tension started in the witness room. “You could feel the stress rolling off the walls in there,” Patrick Ball remembers. “I can remember realizing that this is why lawyers wear sport coats – you can’t see all the sweat on their arms and back.” He was, you could say, a little nervous to be cross-examined by Slobodan Milosevic.
How statistics lifts the fog of war in Syria
Megan Price, director of research, is quoted from her Strata talk regarding how to handle multiple data sources in conflicts such as the one in Syria. From the blog post:
“The true number of casualties in conflicts like the Syrian war seems unknowable, but the mission of the Human Rights Data Analysis Group (HRDAG) is to make sense of such information, clouded as it is by the fog of war. They do this not by nominating one source of information as the “best”, but instead with statistical modeling of the differences between sources.”
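The simplest version of the approach the excerpt describes is two-source capture-recapture: compare two independently collected casualty lists and use the size of their overlap to estimate how many deaths neither list recorded. The sketch below uses Chapman's variant of the Lincoln-Petersen estimator; the lists and identifiers are illustrative assumptions, not HRDAG data, and HRDAG's actual models handle more than two sources.

```python
def lincoln_petersen(list_a, list_b):
    """Chapman's two-source capture-recapture estimate of total population size."""
    a = set(list_a)
    b = set(list_b)
    overlap = len(a & b)
    # Chapman's correction is less biased than the naive len(a)*len(b)/overlap,
    # and is defined even when the overlap is zero.
    return (len(a) + 1) * (len(b) + 1) / (overlap + 1) - 1

# Hypothetical victim identifiers; two records appear on both lists.
source_a = {"v1", "v2", "v3", "v4", "v5"}
source_b = {"v4", "v5", "v6", "v7"}
print(lincoln_petersen(source_a, source_b))  # estimates 9.0 total deaths
```

The key idea: the smaller the overlap between independent lists, the more deaths both lists are likely to have missed, so the estimated total grows.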
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”
The Forensic Humanitarian
International human rights work attracts activists and lawyers, diplomats and retired politicians. One of the most admired figures in the field, however, is a ponytailed statistics guru from Silicon Valley named Patrick Ball, who has spent nearly two decades fashioning a career for himself at the intersection of mathematics and murder. You could call him a forensic humanitarian.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
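To make the role of those predictor variables concrete, here is a minimal sketch of a county-level feature vector being scored by a linear model. The feature names follow the ones quoted in the article, but the weights and county values are illustrative assumptions; the article does not publish HRDAG's actual model or coefficients.

```python
def grave_risk_score(county, weights):
    """Weighted sum of county features, standing in for a trained classifier."""
    return sum(weights[name] * county.get(name, 0.0) for name in weights)

weights = {                      # hypothetical learned coefficients
    "drug_lab_busted": 1.5,      # obvious predictors named in the article
    "borders_us": 0.8,
    "borders_ocean": 0.4,
    "pct_mountainous": 0.6,      # less-obvious predictors named in the article
    "highway_density": 0.5,
    "school_test_scores": -0.3,
}

county = {                       # an invented example county
    "drug_lab_busted": 1,
    "borders_us": 0,
    "borders_ocean": 1,
    "pct_mountainous": 0.7,
    "highway_density": 0.9,
    "school_test_scores": 0.4,
}
print(grave_risk_score(county, weights))
```

A real model would learn the weights from labeled counties rather than hand-set them; the sketch only shows how heterogeneous geographic and social features combine into a single risk score.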
Predictive policing tools send cops to poor/black neighborhoods
In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.