Syria’s status, the migrant crisis and talking to ISIS
In this week’s “Top Picks,” IRIN interviews HRDAG executive director Patrick Ball about giant data sets and whether we can trust them. “No matter how big it is, data on violence is always partial,” he says.
Testimonials
Making the Case: The Role of Statistics in Human Rights Reporting.
Patrick Ball. “Making the Case: The Role of Statistics in Human Rights Reporting.” Statistical Journal of the United Nations Economic Commission for Europe. 18(2-3):163-174. 2001.
Trips to and from Guatemala
The Demography of Large-Scale Human Rights Atrocities: Integrating demographic and statistical analysis into post-conflict historical clarification in Timor-Leste.
Romesh Silva and Patrick Ball. “The Demography of Large-Scale Human Rights Atrocities: Integrating demographic and statistical analysis into post-conflict historical clarification in Timor-Leste.” Paper presented at the 2006 meetings of the Population Association of America.
Social Science Scholars Award for HRDAG Book
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may permanently flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
Liberian Truth and Reconciliation Commission Data
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s research shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.
The Forensic Humanitarian
International human rights work attracts activists and lawyers, diplomats and retired politicians. One of the most admired figures in the field, however, is a ponytailed statistics guru from Silicon Valley named Patrick Ball, who has spent nearly two decades fashioning a career for himself at the intersection of mathematics and murder. You could call him a forensic humanitarian.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”