Update of Iraq and Syria Data in New Paper
Reflections: Growing and Learning in Guatemala
Stay informed about our work
Featured Video
Why It Took So Long To Update the U.N.-Sponsored Syria Death Count
New death toll estimated in Syrian civil war
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
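The excerpt describes county-level predictor variables feeding a model that scores counties for the likelihood of hidden graves. As a minimal, hedged sketch of how such features might be assembled and scored (the feature names and weights below are invented for illustration; the article does not specify HRDAG's actual model form):

```python
import math

def county_feature_vector(drug_lab_busted, borders_us, borders_ocean,
                          pct_mountainous, has_highway, school_score):
    """Assemble a numeric feature vector for one county.

    Features mirror those named in the article; values are illustrative.
    """
    return [
        1.0,                               # intercept term
        1.0 if drug_lab_busted else 0.0,   # drug lab busted in county
        1.0 if borders_us else 0.0,        # borders the United States
        1.0 if borders_ocean else 0.0,     # borders the ocean
        pct_mountainous,                   # fraction in [0, 1]
        1.0 if has_highway else 0.0,       # presence of highways
        school_score,                      # standardized academic results
    ]

def predicted_probability(weights, features):
    """Logistic model: P(mass grave found) = sigmoid(w . x)."""
    z = sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative weights only; a real model would learn these from
# counties where graves have already been documented.
WEIGHTS = [-2.0, 1.5, 0.8, 0.4, 1.2, 0.6, -0.9]

x = county_feature_vector(True, True, False, 0.7, True, -0.5)
p = predicted_probability(WEIGHTS, x)
```

The design point is that each county reduces to a fixed-length numeric vector, so counties with no documented graves yet can still be ranked by predicted risk.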
Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project
Patrick Ball. Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project. © 1996 American Association for the Advancement of Science.
A Definition of Database Design Standards for Human Rights Agencies.
Patrick Ball. “A Definition of Database Design Standards for Human Rights Agencies.” © 1994 American Association for the Advancement of Science. [pdf]
A better statistical estimation of known Syrian war victims
Researchers from Rice University and Duke University are using the tools of statistics and data science in collaboration with Human Rights Data Analysis Group (HRDAG) to accurately and efficiently estimate the number of identified victims killed in the Syrian civil war.
…
Using records from four databases of people killed in the Syrian war, Chen, Duke statistician and machine learning expert Rebecca Steorts and Rice computer scientist Anshumali Shrivastava estimated there were 191,874 unique individuals documented from March 2011 to April 2014. That’s very close to the estimate of 191,369 compiled in 2014 by HRDAG, a nonprofit that helps build scientifically defensible, evidence-based arguments of human rights violations.
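Counting unique individuals across four databases is a record linkage problem: the same victim may appear on several lists under slightly different spellings. As a deliberately simplified sketch (exact-match keys only, with toy records; the Rice/Duke approach handles spelling variants, transliteration, and missing fields probabilistically and at scale):

```python
# Count unique individuals across several overlapping casualty lists.
# Simplification: records are matched on an exact (name, date, location)
# key. Real record linkage must treat near-matches probabilistically.

def unique_victims(*databases):
    """Union the databases on an exact-match key; return distinct count."""
    seen = set()
    for db in databases:
        for record in db:
            key = (record["name"].strip().lower(),
                   record["date"],
                   record["location"].strip().lower())
            seen.add(key)
    return len(seen)

# Toy records for illustration only.
db_a = [{"name": "A. Hassan", "date": "2012-03-01", "location": "Homs"},
        {"name": "B. Khalil", "date": "2012-05-10", "location": "Aleppo"}]
db_b = [{"name": "a. hassan", "date": "2012-03-01", "location": "Homs"},
        {"name": "C. Saleh", "date": "2013-01-22", "location": "Daraa"}]

n = unique_victims(db_a, db_b)  # "A. Hassan" appears on both lists once
```

Here normalization (lowercasing, trimming) catches only trivial variants; the hard statistical work in the published estimates lies in deciding when two non-identical records refer to the same person.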
Making the Case: The Role of Statistics in Human Rights Reporting.
Patrick Ball. “Making the Case: The Role of Statistics in Human Rights Reporting.” Statistical Journal of the United Nations Economic Commission for Europe. 18(2-3):163-174. 2001.
Predictive policing tools send cops to poor/black neighborhoods
In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.
Trips to and from Guatemala
The Demography of Large-Scale Human Rights Atrocities: Integrating demographic and statistical analysis into post-conflict historical clarification in Timor-Leste.
Romesh Silva and Patrick Ball. “The Demography of Large-Scale Human Rights Atrocities: Integrating demographic and statistical analysis into post-conflict historical clarification in Timor-Leste.” Paper presented at the 2006 meetings of the Population Association of America.
Recognising Uncertainty in Statistics
In Responsible Data Reflection Story #7—from the Responsible Data Forum—work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point about how quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”
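Multiple systems estimation generalizes capture-recapture: the overlap patterns among independent lists of documented killings are used to estimate how many victims appear on no list at all. As a hedged illustration of the simplest two-list special case (HRDAG's actual analyses use three or more lists with log-linear or Bayesian models; the Chapman-corrected Lincoln-Petersen estimator below only conveys the core idea):

```python
def chapman_estimate(n1, n2, m):
    """Chapman-corrected Lincoln-Petersen estimate of total population.

    n1, n2 -- deaths documented on list 1 and list 2, respectively
    m      -- deaths appearing on both lists
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Toy figures: two NGO lists with partial overlap.
total = chapman_estimate(n1=99, n2=49, m=24)  # estimated total deaths
documented = 99 + 49 - 24                     # deaths on at least one list
undocumented = total - documented             # deaths on neither list
```

The intuition: if the lists are independent, a small overlap implies each list captured only a small fraction of the true total, so many deaths were never documented by anyone.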
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
Mapping Mexico’s hidden graves
When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.