Data Mining on the Side of the Angels
“Data, by itself, isn’t truth.” How HRDAG uses data analysis and statistical methods to shed light on mass human rights abuses. Executive director Patrick Ball is quoted from his speech at the Chaos Communication Congress in Hamburg, Germany.
System created to predict clandestine graves in Mexico
To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are conducting a statistical analysis built from a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.
Documenting Syrian Deaths with Data Science
Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.
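The point that such models learn from records rather than from crime itself can be illustrated with a minimal simulation. This sketch is hypothetical (the district names, rates, and patrol rule are invented for illustration, not drawn from HRDAG's analysis): two districts have identical true crime rates, but one is over-represented in the historical record, and a model that allocates patrols in proportion to recorded crime keeps that bias alive.

```python
import random

random.seed(0)

# Two districts with IDENTICAL underlying crime rates (hypothetical values).
true_rate = {"A": 0.1, "B": 0.1}

# Historical records over-represent district A (observation bias).
recorded = {"A": 100, "B": 50}

for _ in range(20):
    total = recorded["A"] + recorded["B"]
    # Patrols are allocated in proportion to past *recorded* crime.
    patrols = {d: int(1000 * recorded[d] / total) for d in recorded}
    # Crimes are only recorded where patrols are present, so the
    # over-patrolled district generates more records, which in turn
    # attracts more patrols: the bias is self-reinforcing.
    for d in recorded:
        recorded[d] += sum(random.random() < true_rate[d]
                           for _ in range(patrols[d]))

share_A = recorded["A"] / (recorded["A"] + recorded["B"])
print(f"District A share of recorded crime: {share_A:.2f}")
```

Even though both districts have the same true rate, district A's share of recorded crime stays near its initial biased level rather than converging to one half: the model has found a pattern in the records, not in the crime.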
On clandestine graves, we have more information than the government: Ibero
The model “can distinguish between the municipalities where we are going to find clandestine graves, and those where it is unlikely that we will find them,” explained Patrick Ball, an American statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.
Mapping Mexico’s hidden graves
When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”