Documenting Syrian Deaths with Data Science
Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”
The Bigness of Big Data: samples, models, and the facts we might find when looking at data
Patrick Ball. 2015. The Bigness of Big Data: samples, models, and the facts we might find when looking at data. In The Transformation of Human Rights Fact-Finding, ed. Philip Alston and Sarah Knuckey. New York: Oxford University Press. ISBN: 9780190239497. © The Oxford University Press. All rights reserved.
Nonprofits Are Taking a Wide-Eyed Look at What Data Could Do
In this story about how data are transforming the nonprofit world, Patrick Ball is quoted. Here’s an excerpt: “Data can have a profound impact on certain problems, but nonprofits are kidding themselves if they think the data techniques used by corporations can be applied wholesale to social problems,” says Patrick Ball, head of the nonprofit Human Rights Data Analysis Group.
Companies, he says, maintain complete data sets. A business knows every product it made last year, when it sold, and to whom. Charities, he says, are a different story.
“If you’re looking at poverty or trafficking or homicide, we don’t have all the data, and we’re not going to,” he says. “That’s why these amazing techniques that the industry people have are great in industry, but they don’t actually generalize to our space very well.”
5 Humanitarian FOSS Projects to Watch
Dave Neary described “5 Humanitarian FOSS Projects to Watch,” listing HRDAG’s work on police homicides in the U.S. and on human rights abuses in other countries.
Recognising Uncertainty in Statistics
In Responsible Data Reflection Story #7, from the Responsible Data Forum, work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point that quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”
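Multiple systems estimation builds on the classic capture-recapture idea: when the same killings are documented independently by more than one source, the overlap between the lists reveals roughly how many deaths no source documented. The sketch below is a minimal illustration in Python of the textbook two-list Lincoln-Petersen estimator, with invented numbers; HRDAG’s production models combine three or more lists and account for dependence between them, so this is the simplest special case only.

```python
# Minimal two-list capture-recapture sketch, the simplest special case of
# multiple systems estimation. All numbers below are invented.

def lincoln_petersen(n1: int, n2: int, m: int) -> float:
    """Estimate total population size from two overlapping lists.

    n1: records on list A
    n2: records on list B
    m:  records that appear on both lists
    """
    if m == 0:
        raise ValueError("no overlap between lists: estimator undefined")
    return n1 * n2 / m

# Example: two NGOs independently document 400 and 300 killings, and 120
# victims appear on both lists. The estimated total, including killings
# neither NGO documented, is 400 * 300 / 120 = 1000.
print(lincoln_petersen(400, 300, 120))
```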
PredPol amplifies racially biased policing
HRDAG associate William Isaac is quoted in this article about how predictive policing algorithms such as PredPol exacerbate the problem of racial bias in policing.
The scientist who uses statistics to find the disappeared in El Salvador, Guatemala, and Mexico
Patrick Ball is a bloodhound for the truth. That desire to uncover what others want to hide has led him to develop mathematical formulas to detect the disappeared.
His work consists of applying scientific measurement methods to verify mass human rights violations.
One Better
The University of Michigan College of Literature, Science, and the Arts profiled Patrick Ball in the fall 2016 issue of its alumni magazine. Here’s an excerpt:
Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.
“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
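For readers curious how such a county-level model might look in code, the sketch below fits a simple classifier on features like those named in the article. The feature names, the data, and the choice of logistic regression are all illustrative assumptions; the article does not specify the model HRDAG actually used.

```python
# Hypothetical sketch of a county-level classifier for predicting where
# mass graves might be found. Data and model choice are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# One row per county:
# [drug_lab_busted, borders_us, borders_ocean, pct_mountainous,
#  has_major_highway, mean_school_test_score]
X = np.array([
    [1, 1, 0, 0.30, 1, 0.52],
    [0, 0, 1, 0.05, 1, 0.61],
    [0, 0, 0, 0.70, 0, 0.48],
    [1, 0, 1, 0.10, 1, 0.55],
])
y = np.array([1, 0, 0, 1])  # 1 = a mass grave was found in that county

model = LogisticRegression().fit(X, y)

# Predicted probability that an unseen county contains an undiscovered grave
print(model.predict_proba([[0, 1, 0, 0.25, 1, 0.50]])[:, 1])
```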
The causal impact of bail on case outcomes for indigent defendants in New York City
Kristian Lum, Erwin Ma and Mike Baiocchi (2017). The causal impact of bail on case outcomes for indigent defendants in New York City. Observational Studies 3:39-64. 31 October 2017. © 2017 Institute of Mathematical Statistics.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
Setting the Record Straight on Predictive Policing and Race
William Isaac and Kristian Lum (2018). Setting the Record Straight on Predictive Policing and Race. In Justice Today. 3 January 2018. © 2018 In Justice Today / Medium.
Theoretical limits of microclustering for record linkage
James E. Johndrow, Kristian Lum and D. B. Dunson (2018). Theoretical limits of microclustering for record linkage. Biometrika. 19 March 2018. © 2018 Oxford University Press. DOI 10.1093/biomet/asy003.
Data ‘hashing’ improves estimate of the number of victims in databases
But while HRDAG’s estimate relied on the painstaking efforts of human workers to carefully weed out potential duplicate records, hashing with statistical estimation proved to be faster, easier and less expensive. The researchers said hashing also had the important advantage of a sharp confidence interval: The range of error is plus or minus 1,772, or less than 1 percent of the total number of victims.
“The big win from this method is that we can quickly calculate the probable number of unique elements in a dataset with many duplicates,” said Patrick Ball, HRDAG’s director of research. “We can do a lot with this estimate.”
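The story does not show the method’s details, but the general idea of quickly estimating the number of distinct items in a dataset full of duplicates via hashing can be illustrated with a k-minimum-values (KMV) sketch. This is a generic cardinality estimator chosen here as an assumption for illustration; the research described above uses locality-sensitive hashing with statistical estimation to handle near-duplicate records, which this exact-hash sketch does not attempt.

```python
# k-minimum-values (KMV) sketch: estimate the distinct count of a dataset
# from its smallest hash values. Generic illustration only; not the
# locality-sensitive method used in the paper described above.
import hashlib
import random

def kmv_estimate(records, k=256):
    """Estimate the number of distinct records using the k-th smallest hash."""
    hashes = set()
    for r in records:
        h = int.from_bytes(hashlib.sha1(r.encode("utf-8")).digest()[:8], "big")
        hashes.add(h / 2**64)  # normalize the hash to (0, 1)
    if len(hashes) <= k:
        return len(hashes)       # few distinct values: count exactly
    kth = sorted(hashes)[k - 1]  # k-th smallest normalized hash
    return int((k - 1) / kth)    # classic KMV estimator

# Example: 100,000 noisy records drawn from 10,000 distinct victims.
records = [f"victim-{random.randrange(10_000)}" for _ in range(100_000)]
print(kmv_estimate(records))  # prints a value close to 10,000
```

For clarity this sketch keeps every distinct hash in memory; a production KMV sketch retains only the k smallest values, which is what makes hash-based estimation fast and cheap compared with manual deduplication.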
Unbiased algorithms can still be problematic
“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”
HRDAG Director of Research Patrick Ball, in agreement with Lum, argued that it is perhaps more practical to frame the problem as bias at the institutional or structural level rather than at the individual level. If a police department, for example, is convinced it needs to police one neighborhood more than another, whether any individual officer is racist is not as relevant, he said.