Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
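To make the setup concrete, here is a minimal sketch, assuming invented column names, toy data, and a random-forest classifier (the article does not name the model family), of how county-level predictors like these could feed a grave-prediction model:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical county-level table: one row per county, one column per predictor.
counties = pd.DataFrame({
    "drug_lab_busted":   [1, 0, 0, 1],       # a drug lab was busted in the county
    "borders_us":        [1, 0, 0, 0],       # county borders the United States
    "borders_ocean":     [0, 1, 0, 0],       # county borders the ocean
    "pct_mountainous":   [0.10, 0.45, 0.70, 0.05],
    "highway_present":   [1, 1, 0, 1],
    "school_test_score": [0.52, 0.61, 0.38, 0.49],  # primary/secondary results
    "grave_found":       [1, 0, 0, 1],       # label: a mass grave was found here
})

X = counties.drop(columns="grave_found")
y = counties["grave_found"]

# The model family is an assumption; the article does not say which learner was used.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Score a new, unlabeled county on the same predictors.
new_county = pd.DataFrame([{
    "drug_lab_busted": 1, "borders_us": 0, "borders_ocean": 1,
    "pct_mountainous": 0.30, "highway_present": 1, "school_test_score": 0.44,
}])
print(model.predict_proba(new_county)[:, 1])  # estimated probability of a hidden grave
```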
Syria’s status, the migrant crisis and talking to ISIS
In this week’s “Top Picks,” IRIN interviews HRDAG executive director Patrick Ball about giant data sets and whether we can trust them. “No matter how big it is, data on violence is always partial,” he says.
Weighting for the Guatemalan National Police Archive Sample: Unusual Challenges and Problems
Gary M. Shapiro, Daniel R. Guzmán, Paul Zador, Tamy Guberek, Megan E. Price, Kristian Lum (2009). “Weighting for the Guatemalan National Police Archive Sample: Unusual Challenges and Problems.” In JSM Proceedings, Survey Research Methods Section. Alexandria, VA: American Statistical Association.
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
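A toy illustration of that failure mode, assuming invented heights and weights and a simple linear regression: a model fit only to third graders extrapolates badly when asked about adults.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Training set: third graders only, roughly 50-70 lbs and 48-52 inches tall,
# with the heavier, self-conscious kids having opted out entirely.
weights_lbs = rng.uniform(50, 70, size=200)
heights_in = 48 + 0.05 * (weights_lbs - 50) + rng.normal(0, 1, size=200)

model = LinearRegression().fit(weights_lbs.reshape(-1, 1), heights_in)

# Asking about adults: the model has never seen anyone this heavy, so it
# extrapolates to barely over four feet regardless of their true heights.
adult_weights_lbs = np.array([[150.0], [180.0], [210.0]])
print(model.predict(adult_weights_lbs) / 12)  # roughly 4.3 to 4.7 feet
```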
One Better
The University of Michigan College of Literature, Science and the Arts profiled Patrick Ball in its fall 2016 issue of the alumni magazine. Here’s an excerpt:
Ball believes doing this laborious, difficult work makes the world a more just place because it leads to accountability.
“My part is a specific, narrow piece, which just happens to fit with the skills I have,” he says. “I don’t think that what we do is in any way the best or most important part of human rights activism. Sometimes, we are just a footnote—but we are a really good footnote.”
The Data Scientist Helping to Create Ethical Robots
Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.
What’s the relationship between statistics and AI and machine learning?
AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.
Predictive policing tools send cops to poor/black neighborhoods
In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.
The killing of leaders is a more serious problem than people think
An investigation by Dejusticia and the Human Rights Data Analysis Group finds that the killings of social leaders perpetrated in Colombia are undercounted. After analyzing the differing homicide figures published by various organizations since 2016, the researchers concluded that the problem is larger than is commonly believed.
How much faith can we place in coronavirus antibody tests?
Social Science Scholars Award for HRDAG Book
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
What HBR Gets Wrong About Algorithms and Bias
“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”
Mapping Mexico’s hidden graves
When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.