Benetech’s Human Rights Data Analysis Group Publishes 2010 Analysis of Human Rights Violations in Five Countries,
Analysis of Uncovered Government Data from Guatemala and Chad Clarifies History and Supports Criminal Prosecutions
By Ann Harrison
The past year of research by the Benetech Human Rights Data Analysis Group (HRDAG) has supported criminal prosecutions and uncovered the truth about political violence in Guatemala, Iran, Colombia, Chad and Liberia. On today’s celebration of the 62nd anniversary of the Universal Declaration of Human Rights, HRDAG invites the international community to adopt scientifically defensible methodologies that illuminate all human rights violations – including those that cannot be directly observed. 2011 will mark the 20th year that HRDAG researchers have analyzed the patterns and magnitude of human rights violations in political conflicts to determine how many of the killed and disappeared have never been accounted for – and who is most responsible.
R programming language demands the right use case
HRDAG’s director of research, Megan Price, is quoted in this story about the R programming language. “Serious data analysis is not something you’re going to do using a mouse and drop-down boxes,” Price said. “It’s the kind of thing you’re going to do getting close to the data, getting close to the code and writing some of it yourself.”
System created to predict clandestine graves in Mexico
To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are carrying out a statistical analysis built from a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.
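A minimal sketch of what such a municipality-level model could look like. The column names, synthetic data, and choice of classifier below are assumptions made only for illustration; they are not taken from the actual HRDAG/UIA/Data Cívica analysis.

```python
# Illustrative sketch only: not a reproduction of the HRDAG/UIA/Data Cívica model.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical municipality-level table: geographic and sociodemographic
# covariates, plus a 0/1 label built from automated media searches
# ("was a clandestine grave reported in this municipality?").
munis = pd.DataFrame({
    "population":            rng.integers(1_000, 1_500_000, size=500),
    "poverty_rate":          rng.uniform(0.05, 0.8, size=500),
    "distance_to_border_km": rng.uniform(0, 1_200, size=500),
    "homicide_rate":         rng.uniform(0, 100, size=500),
    "grave_reported":        rng.integers(0, 2, size=500),  # label from media search
})

features = ["population", "poverty_rate", "distance_to_border_km", "homicide_rate"]
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(munis[features], munis["grave_reported"])

# Rank municipalities by predicted probability that a grave would be found,
# the kind of output a search team could use to prioritize fieldwork.
munis["predicted_risk"] = model.predict_proba(munis[features])[:, 1]
print(munis.sort_values("predicted_risk", ascending=False).head())
```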
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
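A toy simulation of the feedback effect Lum describes. The two-neighborhood setup and every number here are assumptions chosen only to make the bias visible; this is not a reproduction of the Oakland analysis.

```python
# Minimal sketch: records trained on biased observation reinforce the bias.
import numpy as np

rng = np.random.default_rng(1)

# Suppose true drug crime is roughly equal in two neighborhoods...
true_rate = {"A": 0.5, "B": 0.5}
# ...but historical patrols were concentrated in neighborhood A, so more of
# A's crimes ended up in the records (elsewhere, "data collectors just missed
# some criminals and crime sites").
patrol_share = {"A": 0.8, "B": 0.2}

records = {"A": 0, "B": 0}
for day in range(365):
    for hood in records:
        crimes = rng.poisson(true_rate[hood] * 10)           # what actually happened
        observed = rng.binomial(crimes, patrol_share[hood])  # what got recorded
        records[hood] += observed
    # A "predictive" model trained on the records sends tomorrow's patrols
    # where past records are highest, reinforcing the original skew.
    total = records["A"] + records["B"] or 1
    patrol_share = {h: records[h] / total for h in records}

print(records)        # records pile up in A even though true rates are equal
print(patrol_share)   # patrol allocation drifts further toward A
```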
Death March
A mapped representation of the scale and spread of killings in Syria. HRDAG’s director of research, Megan Price, is quoted.
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.
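A rough sketch of the error trade-off Ball is pointing to: moving a decision threshold swaps one kind of error for another. The synthetic scores and thresholds below are assumptions made for the example.

```python
# Illustration only: choosing a threshold is choosing which error the machine makes.
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "risk scores" for 1,000 cases, 100 of which are true positives.
labels = np.zeros(1_000, dtype=int)
labels[:100] = 1
scores = np.where(labels == 1,
                  rng.normal(0.7, 0.15, size=1_000),
                  rng.normal(0.4, 0.15, size=1_000))

for threshold in (0.3, 0.5, 0.7):
    predicted = (scores >= threshold).astype(int)
    false_pos = int(((predicted == 1) & (labels == 0)).sum())
    false_neg = int(((predicted == 0) & (labels == 1)).sum())
    # Lowering the threshold trades false negatives for false positives.
    print(f"threshold={threshold:.1f}  false positives={false_pos}  false negatives={false_neg}")
```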