Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crimes or where they will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce, and in some cases amplify, that bias. She also works on statistical issues related to the criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
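One simple way to get intuition for “removing sensitive information from training data” is group-wise centering: subtract each protected group’s mean from a feature, so the transformed feature no longer carries group-mean information that a downstream model could exploit. This is a minimal illustrative sketch, not Lum’s published method; the function and data below are hypothetical.

```python
from statistics import mean

def remove_group_means(feature, group):
    """Center `feature` within each level of `group`, so every group
    has the same (zero) mean on the transformed feature and a linear
    model can no longer recover group membership from group means."""
    levels = set(group)
    group_mean = {g: mean(x for x, gi in zip(feature, group) if gi == g)
                  for g in levels}
    return [x - group_mean[g] for x, g in zip(feature, group)]

# Toy data: a feature whose mean differs sharply by protected group.
feature = [10.0, 12.0, 11.0, 3.0, 5.0, 4.0]
group = ["a", "a", "a", "b", "b", "b"]

adjusted = remove_group_means(feature, group)
# Both groups now have mean 0 on the adjusted feature.
```

Real fairness-preprocessing methods go further (handling nonlinear and multivariate dependence), but the goal is the same: transform the training data so predictions cannot lean on the sensitive variable.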
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
On or off the record? Detecting patterns of silence about death in Guatemala’s National Police Archive
Tamy Guberek and Margaret Hedstrom (2017). On or off the record? Detecting patterns of silence about death in Guatemala’s National Police Archive. Archival Science. 9 February 2017. © Springer. DOI 10.1007/s10502-017-9274-3.
The Statistics of Genocide
Patrick Ball and Megan Price (2018). The Statistics of Genocide. Chance (special issue). February 2018. © 2018 CHANCE.
Working Where Statistics and Human Rights Meet
Robin Mejia and Megan Price (2018). Working Where Statistics and Human Rights Meet. Chance (special issue). February 2018. © 2018 CHANCE.
Documents of war: Understanding the Syrian Conflict
Megan Price, Anita Gohdes, and Patrick Ball (2015). Documents of War: Understanding the Syrian Conflict. Significance 12, no. 2 (April): 14–19. doi: 10.1111/j.1740-9713.2015.00811.x. © 2015 The Royal Statistical Society. All rights reserved. [online abstract]
Liberian Truth and Reconciliation Commission Data
Data Mining for Good: CJA Drink + Think
Uncertainty in COVID Fatality Rates
Mexico
Reflections: Growing and Learning in Guatemala
Update of Iraq and Syria Data in New Paper
Why It Took So Long To Update the U.N.-Sponsored Syria Death Count
New death toll estimated in Syrian civil war
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
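The feedback loop Lum describes can be illustrated with a toy simulation (a sketch of the mechanism, not the published PredPol analysis): two districts have identical true crime rates, but one starts with slightly more recorded crime. If patrols always go to the district with the most recorded crime, and police presence itself generates new crime reports, the initial disparity snowballs.

```python
def simulate(records, steps=1000):
    """Each day, patrols go to the district with the most recorded
    crime (hotspot-style allocation). Crime is equally likely in
    both districts, so wherever police go they record one new
    incident -- which makes that district look even more
    crime-ridden tomorrow."""
    records = list(records)
    for _ in range(steps):
        hotspot = records.index(max(records))
        records[hotspot] += 1  # observed crime tracks police presence
    return records

# Identical true crime rates; district 0 has one extra historical report.
final = simulate([11, 10])
# District 0 absorbs every patrol: final == [1011, 10]
```

The recorded data end up reflecting where officers were sent, not where crime actually occurred; retraining on those records locks the loop in place.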
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
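The predictor variables in the quote can be pictured as a county-level feature vector. The encoding below is an illustrative sketch (field names paraphrased from the article; the values and the encoding scheme are hypothetical, not HRDAG's actual pipeline), of the kind a classifier would consume.

```python
def county_features(county):
    """Encode the predictors named in the article as a numeric vector:
    drug-lab bust (binary), borders the US (binary), borders the
    ocean (binary), fraction mountainous, highway presence (binary),
    and a standardized school test score."""
    return [
        1.0 if county["drug_lab_busted"] else 0.0,
        1.0 if county["borders_us"] else 0.0,
        1.0 if county["borders_ocean"] else 0.0,
        county["pct_mountainous"],
        1.0 if county["has_highway"] else 0.0,
        county["school_score_z"],
    ]

# A hypothetical coastal, mountainous county with a past lab bust.
example = {
    "drug_lab_busted": True, "borders_us": False, "borders_ocean": True,
    "pct_mountainous": 0.35, "has_highway": True, "school_score_z": -0.8,
}
vec = county_features(example)
```

A model trained on such vectors for counties with known graves can then score every other county, which is how a search effort gets prioritized.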
Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project
Patrick Ball. Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project. © 1996 American Association for the Advancement of Science.