Talks & Discussions
Making the Case: The Role of Statistics in Human Rights Reporting.
Patrick Ball. 2001. “Making the Case: The Role of Statistics in Human Rights Reporting.” Statistical Journal of the United Nations Economic Commission for Europe 18, no. 2-3: 163–174.
Tech Note – improving LLM-driven info extraction
Selection Bias and the Statistical Patterns of Mortality in Conflict.
Megan Price and Patrick Ball. 2015. Statistical Journal of the IAOS 31: 263–272. doi: 10.3233/SJI-150899. © IOS Press and the authors. All rights reserved. Creative Commons BY-NC-SA.
South Africa
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
The True Dangers of AI are Closer Than We Think
William Isaac is quoted.
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Quantifying Injustice
“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
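The setup the article describes can be illustrated with a toy simulation: build a synthetic population whose true offense rates are known by construction, then show that arrest records generated under uneven police attention diverge from those true rates. The Python sketch below is a hypothetical illustration with made-up neighborhoods and rates, not Lum and Isaac's actual analysis or code.

    # Toy illustration of the synthetic-population idea (hypothetical numbers).
    # True offense rates are identical in both neighborhoods, but offenses in
    # "A" are more likely to be recorded as arrests, standing in for heavier
    # historical patrolling. A predictor trained on arrest counts will then
    # keep pointing at "A" even though the underlying rates are equal.
    import numpy as np

    rng = np.random.default_rng(0)
    n_people = 100_000
    neighborhoods = np.array(["A", "B"])

    # Synthetic population: residence split evenly, same true offense rate,
    # as if estimated from surveys rather than from arrest records.
    home = rng.choice(neighborhoods, size=n_people)
    offends = rng.random(n_people) < 0.03          # identical everywhere

    # Biased observation: offenses in "A" are recorded twice as often.
    p_record = np.where(home == "A", 0.50, 0.25)
    arrested = offends & (rng.random(n_people) < p_record)

    for hood in neighborhoods:
        mask = home == hood
        print(f"{hood}: true offenses={offends[mask].sum()}, "
              f"recorded arrests={arrested[mask].sum()}")

Because the population is synthetic, the true rates are known exactly, so the gap between them and the recorded arrests isolates the bias in the data a prediction model would be trained and tested on.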
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crime or where it will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and, in some cases, amplify this bias. She also works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender; a sketch of this idea appears below. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
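One way to picture the “removing sensitive information” idea is to residualize each training feature against the protected attribute, so that a downstream model cannot linearly recover it. The Python sketch below is a simplified stand-in with synthetic data, not Lum's published method; the feature names and coefficients are assumptions for illustration.

    # Toy illustration of stripping a sensitive variable's linear signal from
    # training features: regress each feature on the protected attribute and
    # keep the residuals. Synthetic data; a simplification, not the published
    # method.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5_000
    protected = rng.integers(0, 2, size=n).astype(float)  # binary group label

    x1 = 2.0 * protected + rng.normal(size=n)   # leaks the protected attribute
    x2 = rng.normal(size=n)                     # unrelated feature
    X = np.column_stack([x1, x2])

    # Fit X ~ [1, protected] by least squares and subtract the fitted values.
    Z = np.column_stack([np.ones(n), protected])
    beta, *_ = np.linalg.lstsq(Z, X, rcond=None)
    X_fair = X - Z @ beta

    for name, col in [("x1", X[:, 0]), ("x1 adjusted", X_fair[:, 0])]:
        r = np.corrcoef(col, protected)[0, 1]
        print(f"corr({name}, protected) = {r:+.3f}")
    # After adjustment the correlation is ~0: a downstream model cannot
    # linearly recover the protected attribute from these features.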
How a Data Tool Tracks Police Misconduct and Wandering Officers
Liberian Truth and Reconciliation Commission Data
Benetech’s Human Rights Data Analysis Group Publishes 2010 Analysis of Human Rights Violations in Five Countries
Analysis of Uncovered Government Data from Guatemala and Chad Clarifies History and Supports Criminal Prosecutions
By Ann Harrison
The past year of research by the Benetech Human Rights Data Analysis Group (HRDAG) has supported criminal prosecutions and uncovered the truth about political violence in Guatemala, Iran, Colombia, Chad and Liberia. On today’s celebration of the 62nd anniversary of the Universal Declaration of Human Rights, HRDAG invites the international community to engage with scientifically defensible methodologies that illuminate all human rights violations – including those that cannot be directly observed. 2011 will mark the 20th year that HRDAG researchers have analyzed the patterns and magnitude of human rights violations in political conflicts to determine how many of the killed and disappeared have never been accounted for – and who is most responsible.
How We Choose Projects
Talks
Trove to IPFS
Donate with Cryptocurrency
Documents of war: Understanding the Syrian Conflict
Megan Price, Anita Gohdes, and Patrick Ball. 2015. Significance 12, no. 2 (April): 14–19. doi: 10.1111/j.1740-9713.2015.00811.x. © 2015 The Royal Statistical Society. All rights reserved. [online abstract]
The Bigness of Big Data: samples, models, and the facts we might find when looking at data
Patrick Ball. 2015. The Bigness of Big Data: samples, models, and the facts we might find when looking at data. In The Transformation of Human Rights Fact-Finding, ed. Philip Alston and Sarah Knuckey. New York: Oxford University Press. ISBN: 9780190239497. © The Oxford University Press. All rights reserved.