The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.
What’s the relationship between statistics and AI and machine learning?
AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Predictive policing tools send cops to poor/black neighborhoods
In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
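Lum's warning is the base-rate problem: when the behavior being screened for is extremely rare, even a fairly accurate tool flags far more innocent people than actual cases. A minimal sketch of that arithmetic, using entirely hypothetical numbers (none of these figures come from the article):

```python
# Hypothetical base-rate arithmetic: a rare condition plus an imperfect
# screen yields mostly false positives. All numbers below are assumed
# for illustration only.

population = 10_000_000     # people screened (assumed)
prevalence = 1e-5           # 100 true cases per 10M people (assumed)
sensitivity = 0.9           # screen catches 90% of true cases (assumed)
false_positive_rate = 0.01  # screen wrongly flags 1% of everyone else (assumed)

true_cases = population * prevalence
caught = sensitivity * true_cases
false_alarms = false_positive_rate * (population - true_cases)

# Precision: of everyone flagged, what share are actual cases?
precision = caught / (caught + false_alarms)
print(f"true cases flagged:       {caught:.0f}")
print(f"innocent people flagged:  {false_alarms:.0f}")
print(f"share of flags that are real cases: {precision:.3%}")
```

With these assumed numbers, roughly a thousand innocent people are flagged for every real case caught, which is exactly the "huge number of false positives" Lum describes.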
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
Amnesty International Reports Organized Murder Of Detainees In Syrian Prison
Reports of torture and disappearances in Syria are not new. But the Amnesty International report says the magnitude and severity of abuse has “increased drastically” since 2011. Citing the Human Rights Data Analysis Group, the report says “at least 17,723 people were killed in government custody between March 2011 and December 2015, an average of 300 deaths each month.”
One Better
The University of Michigan College of Literature, Science and the Arts profiled Patrick Ball in its fall 2016 issue of the alumni magazine. Here’s an excerpt:
Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.
“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
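The parable above can be made concrete with a toy model. This sketch uses a nearest-neighbor regressor (one illustrative choice of model, not anything from the excerpt) and entirely synthetic child-sized training data: because every example the model has ever seen is a grade-three student, any adult weight gets mapped back to the heights of the heaviest children.

```python
# Synthetic illustration of the training-data problem: the model is fine,
# the sample is not. All weights (lbs) and heights (inches) are made up.

def knn_predict(weight, train, k=3):
    """Predict height as the mean height of the k training points
    with the closest weights."""
    nearest = sorted(train, key=lambda p: abs(p[0] - weight))[:k]
    return sum(height for _, height in nearest) / k

# Training set drawn only from a grade-three class (assumed values);
# everyone is roughly four feet tall.
train = [(50, 48), (55, 49), (60, 50), (62, 51), (58, 50), (53, 49)]

# Apply the model to a 170 lb adult: its nearest "neighbors" are all
# children, so it confidently predicts a child's height.
pred = knn_predict(170, train)
print(f"predicted height for a 170 lb adult: {pred:.1f} inches")
```

The prediction comes out a little over four feet, no matter how heavy the adult is. As the excerpt says, no amount of tuning fixes this; only representative data, or an explicit correction when the model's outputs are checked against reality, can.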


