The Bigness of Big Data: samples, models, and the facts we might find when looking at data
Patrick Ball (2015). The Bigness of Big Data: samples, models, and the facts we might find when looking at data. In The Transformation of Human Rights Fact-Finding, eds. Philip Alston and Sarah Knuckey. New York: Oxford University Press. ISBN: 9780190239497. © Oxford University Press. All rights reserved.
Nonprofits Are Taking a Wide-Eyed Look at What Data Could Do
In this story about how data are transforming the nonprofit world, Patrick Ball is quoted. Here’s an excerpt: “Data can have a profound impact on certain problems, but nonprofits are kidding themselves if they think the data techniques used by corporations can be applied wholesale to social problems,” says Patrick Ball, head of the nonprofit Human Rights Data Analysis Group.
Companies, he says, maintain complete data sets. A business knows every product it made last year, when it sold, and to whom. Charities, he says, are a different story.
“If you’re looking at poverty or trafficking or homicide, we don’t have all the data, and we’re not going to,” he says. “That’s why these amazing techniques that the industry people have are great in industry, but they don’t actually generalize to our space very well.”
5 Humanitarian FOSS Projects to Watch
Dave Neary described "5 Humanitarian FOSS Projects to Watch," listing HRDAG's work on police homicides in the U.S. and on human rights abuses in other countries.
Recognising Uncertainty in Statistics
In Responsible Data Reflection Story #7, from the Responsible Data Forum, work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point that quantitative data are the result of numerous subjective human decisions. An excerpt: "The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation."
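For readers unfamiliar with multiple systems estimation, the sketch below shows its simplest two-list form, the Lincoln-Petersen estimator, using invented counts. HRDAG's actual analyses use more than two lists and far richer models; this only illustrates the core idea of estimating undocumented killings from list overlap.

```python
# Minimal two-list multiple systems estimation (Lincoln-Petersen).
# All counts below are hypothetical, purely for illustration.

def lincoln_petersen(n_list_a, n_list_b, n_overlap):
    """Estimate total population size from two partial lists.

    n_list_a: victims documented by the first source
    n_list_b: victims documented by the second source
    n_overlap: victims appearing on both lists
    """
    if n_overlap == 0:
        raise ValueError("No overlap between lists: estimator undefined.")
    return n_list_a * n_list_b / n_overlap

# Hypothetical: two NGOs document 600 and 400 killings, 150 on both lists.
estimate = lincoln_petersen(600, 400, 150)
documented = 600 + 400 - 150
print(f"Estimated total killings: {estimate:.0f}")        # 1600
print(f"Estimated undocumented:   {estimate - documented:.0f}")  # 750
```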
PredPol amplifies racially biased policing
HRDAG associate William Isaac is quoted in this article about how predictive policing algorithms such as PredPol exacerbate the problem of racial bias in policing.
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
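A minimal sketch of that failure mode, with invented numbers: train a height-from-weight model only on synthetic third-grade data (where the heavier children skip the exercise) and then ask it about an adult.

```python
# Illustrative sketch of the biased-training-data problem described above.
# All numbers are synthetic, chosen only to reproduce the failure mode.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Third graders: roughly 45-60 lbs, about four feet tall, with heavier
# kids allowed to skip the exercise (self-selection bias in the sample).
weight = rng.uniform(45, 60, 300)
height = 35 + 0.3 * weight + rng.normal(0, 1.0, 300)   # ~48-53 inches
keep = weight < 55
X, y = weight[keep].reshape(-1, 1), height[keep]

model = DecisionTreeRegressor(max_depth=3).fit(X, y)

# The tree can only predict heights it has seen in training, so a
# 160-lb adult is still predicted to be about four feet tall.
print(model.predict([[160.0]]))   # roughly 50 inches
```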
One Better
The University of Michigan College of Literature, Science, and the Arts profiled Patrick Ball in the fall 2016 issue of its alumni magazine. Here's an excerpt:
Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.
“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”
Documenting Syrian Deaths with Data Science
Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
Machine learning is being used to uncover the mass graves of Mexico’s missing
"Patrick Ball, HRDAG's Director of Research and the statistician behind the code, explained that the Random Forest classifier was able to predict with 100% accuracy which counties would go on to have mass graves found in them in 2014 by using the model against data from 2013. The model also predicted the counties that did not have hidden mass graves found in them, but that show a high likelihood of the possibility. This predictive aspect of the model is the part that holds the most potential for future research."
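As a rough illustration of the setup these two articles describe (train on one year's county-level data, test against the next year's discoveries), here is a hedged sketch. The file names, column names, and parameters are hypothetical stand-ins, not HRDAG's actual code or data.

```python
# Hedged sketch of a train-on-2013, evaluate-on-2014 Random Forest setup.
# Feature names follow the predictors mentioned in the article; the CSV
# layout and all identifiers here are assumed for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

FEATURES = [
    "drug_lab_busted",     # obvious predictors mentioned in the article
    "borders_us",
    "borders_ocean",
    "pct_mountainous",     # less-obvious predictors
    "highway_presence",
    "school_exam_scores",
]

# Hypothetical inputs: one row per county, with a 0/1 label for whether
# a hidden mass grave was found there that year.
df_2013 = pd.read_csv("counties_2013.csv")
df_2014 = pd.read_csv("counties_2014.csv")

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(df_2013[FEATURES], df_2013["grave_found"])

# Predicted probabilities rank 2014 counties by risk; high-probability
# counties with no discovered grave yet are the leads for future searches.
df_2014["predicted_risk"] = clf.predict_proba(df_2014[FEATURES])[:, 1]
print(df_2014.sort_values("predicted_risk", ascending=False).head())
```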
The causal impact of bail on case outcomes for indigent defendants in New York City
Kristian Lum, Erwin Ma, and Mike Baiocchi (2017). The causal impact of bail on case outcomes for indigent defendants in New York City. Observational Studies 3: 39–64. 31 October 2017. © 2017 Institute of Mathematical Statistics.
Trump's "extreme-vetting" software will discriminate against immigrants "under a veneer of objectivity," say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
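Lum's point is the base-rate problem: when the outcome being screened for is rare, even a very accurate tool flags mostly people who would never offend. A quick worked example makes this concrete; the prevalence and accuracy figures below are assumed purely for illustration.

```python
# Worked base-rate example. Every figure here is an assumption chosen
# only to show the arithmetic, not a claim about any real system.
population = 10_000_000       # people screened
prevalence = 1 / 100_000      # assumed rate of future offenders
sensitivity = 0.99            # assumed: flags 99% of true positives
false_positive_rate = 0.01    # assumed: flags 1% of everyone else

true_positives = population * prevalence * sensitivity
false_positives = population * (1 - prevalence) * false_positive_rate

print(f"True positives flagged:  {true_positives:,.0f}")    # ~99
print(f"False positives flagged: {false_positives:,.0f}")   # ~100,000
print(f"Share of flags that are wrong: "
      f"{false_positives / (true_positives + false_positives):.1%}")
```

Under these assumptions, about 99.9% of the people flagged would never have gone on to commit criminal acts, which is exactly the harm Lum describes.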
Setting the Record Straight on Predictive Policing and Race
William Isaac and Kristian Lum (2018). Setting the Record Straight on Predictive Policing and Race. In Justice Today. 3 January 2018. © 2018 In Justice Today / Medium.
