UN Raises Estimate of Dead in Syrian Conflict to 191,000
‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley
Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.
“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.
You Are Not So Smart: How we miss what is missing and what to do about it
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”
Predictive policing tools send cops to poor/black neighborhoods
In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade-three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
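The height-from-weight analogy can be sketched in a few lines. This is a purely illustrative toy (the numbers, the 1-nearest-neighbour model, and the `predict_height` helper are all invented for the sketch, not taken from any HRDAG work): a model that cannot extrapolate beyond a biased training sample keeps answering "about four feet tall" no matter whom you ask it about.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: a grade-three class, where heavier
# students were allowed to skip the measurement (selection bias).
weights = rng.normal(55, 8, 200)                          # pounds
heights = 30 + 0.35 * weights + rng.normal(0, 1.5, 200)   # inches (~4 ft)
keep = weights < 60                                       # self-conscious kids opt out
w_train, h_train = weights[keep], heights[keep]

def predict_height(weight):
    """1-nearest-neighbour prediction: the model can only echo
    the (biased) training data, never extrapolate beyond it."""
    i = np.argmin(np.abs(w_train - weight))
    return h_train[i]

# An adult weighing 160 lb is matched to the heaviest child on file,
# so the prediction stays around four feet.
print(predict_height(160.0) < 60)  # stays under five feet
```

The bug is not in the nearest-neighbour rule, which works exactly as designed; it is in who got measured.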
One Better
The University of Michigan College of Literature, Science and the Arts profiled Patrick Ball in its fall 2016 issue of the alumni magazine. Here’s an excerpt:
Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.
“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”
Amnesty International Reports Organized Murder Of Detainees In Syrian Prison
Reports of torture and disappearances in Syria are not new. But the Amnesty International report says the magnitude and severity of abuse has “increased drastically” since 2011. Citing the Human Rights Data Analysis Group, the report says “at least 17,723 people were killed in government custody between March 2011 and December 2015, an average of 300 deaths each month.”
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
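A county-level model of this shape can be pictured as one feature vector per county fed to a classifier. The sketch below is entirely hypothetical (feature values, weights, and the logistic form are invented for illustration; the article does not describe HRDAG's actual model specification or data):

```python
import numpy as np

# One row per county: [drug_lab_busted, borders_us, borders_ocean,
#                      pct_mountainous, has_highway, school_score]
# All values are made up for illustration.
X = np.array([
    [1, 1, 0, 0.40, 1, 0.52],
    [0, 0, 1, 0.05, 1, 0.71],
    [0, 0, 0, 0.60, 0, 0.45],
])

# Illustrative logistic-regression weights and intercept.
w = np.array([1.2, 0.8, 0.5, 0.9, 0.4, -1.1])
b = -1.5

# Predicted probability of a mass grave being found, per county.
prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))
print(prob.round(2))
```

The point of mixing "obvious" and "less obvious" predictors is that the model can pick up signal, such as school outcomes proxying for state presence, that a human analyst might not think to rank highly.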
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
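The "patterns in police records, not patterns in crime" finding can be illustrated with a minimal feedback-loop sketch (all numbers invented; this is not the Lum and Isaac analysis itself): if patrols are allocated in proportion to past recorded incidents, and new records accrue where police are looking, an initial disparity in records persists even when the true crime rates are identical.

```python
import numpy as np

# Two districts with identical true crime rates...
true_crime = np.array([0.5, 0.5])
# ...but district 0 starts with ten times the recorded incidents,
# because it was historically over-patrolled.
records = np.array([10.0, 1.0])

for _ in range(20):
    # Deploy patrols in proportion to recorded incidents,
    patrol_share = records / records.sum()
    # then record new incidents in proportion to where police look.
    records += patrol_share * true_crime * 100

print((records / records.sum()).round(3))  # still skewed toward district 0
```

Each iteration reinforces the existing allocation, so the system "predicts" its own past observations rather than the underlying distribution of crime.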