Tech Note – using LLMs for structured info extraction
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
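Lum's point can be made concrete with a small simulation, sketched below in Python. It is an illustrative toy under invented numbers, not HRDAG's or Lum's actual model: two areas have identical underlying crime rates, but patrols are allocated in proportion to previously recorded crime, and crime can only be recorded where patrols go.

```python
# Toy simulation of the feedback loop in Lum's quote (illustrative only;
# the rates, populations, and starting counts are invented).
import random

random.seed(0)

TRUE_RATE = 0.05                  # identical underlying crime rate in both areas
POPULATION = 10_000               # residents per area
recorded = {"A": 120, "B": 80}    # historical records skewed toward area A

for year in range(1, 11):
    total = recorded["A"] + recorded["B"]
    # Patrols follow recorded crime, and crime is only recorded where patrols go.
    patrol_share = {area: recorded[area] / total for area in recorded}
    for area in recorded:
        detected = sum(
            random.random() < TRUE_RATE * patrol_share[area]
            for _ in range(POPULATION)
        )
        recorded[area] += detected
    print(f"year {year}: share of patrols sent to A = {patrol_share['A']:.2f}")

# Even though the true rates never differ, area A keeps drawing roughly 60%
# of patrols: the historical skew is reproduced year after year, never corrected.
```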
New UN report counts 191,369 Syrian-war deaths — but the truth is probably much, much worse
Haiti
A better statistical estimation of known Syrian war victims
Researchers from Rice University and Duke University are using the tools of statistics and data science in collaboration with the Human Rights Data Analysis Group (HRDAG) to accurately and efficiently estimate the number of identified victims killed in the Syrian civil war.
…
Using records from four databases of people killed in the Syrian war, Chen, Duke statistician and machine learning expert Rebecca Steorts, and Rice computer scientist Anshumali Shrivastava estimated there were 191,874 unique individuals documented from March 2011 to April 2014. That’s very close to the estimate of 191,369 compiled in 2014 by HRDAG, a nonprofit that helps build scientifically defensible, evidence-based arguments about human rights violations.
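The counting problem underneath these estimates, deciding when records from different databases describe the same person, can be sketched with a toy deduplication pass. The field names and records below are hypothetical, and the published work relies on locality-sensitive hashing and statistical record linkage rather than the exact key matching shown here.

```python
# Toy sketch of counting unique documented victims across overlapping lists.
# Records and field names are invented for illustration.
import unicodedata

def match_key(record: dict) -> tuple:
    """Build a crude matching key from name, date of death, and location."""
    name = unicodedata.normalize("NFKD", record["name"]).casefold()
    name = " ".join(sorted(name.split()))       # ignore token order in names
    return (name, record["date"], record["governorate"].casefold())

# Four overlapping documentation lists standing in for the real databases.
lists = [
    [{"name": "Ahmad Khalil", "date": "2012-03-04", "governorate": "Homs"}],
    [{"name": "Khalil Ahmad", "date": "2012-03-04", "governorate": "homs"}],
    [{"name": "Sara Haddad", "date": "2013-07-19", "governorate": "Aleppo"}],
    [{"name": "Sara Haddad", "date": "2013-07-19", "governorate": "Aleppo"}],
]

unique = {match_key(rec) for source in lists for rec in source}
print(f"records: {sum(len(s) for s in lists)}, unique individuals: {len(unique)}")
# -> records: 4, unique individuals: 2
```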
Welcoming Our New Data Scientist
Clustering and Solving the Right Problem
Using Machine Learning to Help Human Rights Investigators Sift Massive Datasets
Where Stats and Rights Thrive Together
HRDAG – 25 Years and Counting
HRDAG and the Trial of José Efraín Ríos Montt
Una Mirada al Archivo Histórico de la Policía Nacional a Partir de un Estudio Cuantitativo (A Look at the Historical Archive of the National Police Based on a Quantitative Study)
Carolina López, Beatriz Vejarano, and Megan Price. 2016. Human Rights Data Analysis Group. © 2016 HRDAG. Creative Commons BY-NC-SA.
Big Data, Selection Bias, and the Statistical Patterns of Mortality in Conflict
Megan Price and Patrick Ball (2014). SAIS Review of International Affairs. © 2014 The Johns Hopkins University Press. This article first appeared in SAIS Review, Volume 34, Issue 1, Winter-Spring 2014, pages 9-20. All rights reserved.
Big Data Predictive Analytics Comes to Academic and Nonprofit Institutions to Fuel Innovation
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crime or where it will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and, in some cases, amplify that bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of the FAT conference, which brings together researchers and practitioners working on these questions in socio-technical systems.
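As a rough illustration of the "removing sensitive information" idea mentioned above, the sketch below residualizes a feature on a protected attribute so that a downstream model cannot linearly recover that attribute. The data are synthetic and the method is a simplified linear toy, not Lum's published approach, which handles more general forms of dependence.

```python
# Strip the (linear) influence of a sensitive attribute from a training feature
# by regressing the feature on the attribute and keeping only the residual.
# Synthetic data; simplified illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

sensitive = rng.integers(0, 2, size=n)        # protected group label (0/1)
# A feature (e.g. prior recorded arrests) partly driven by group membership
# through disparate enforcement rather than underlying behaviour.
feature = 2.0 * sensitive + rng.normal(size=n)

# Least-squares fit of the feature on [intercept, sensitive]; the residual is
# the part of the feature carrying no linear information about the group.
X = np.column_stack([np.ones(n), sensitive])
coef, *_ = np.linalg.lstsq(X, feature, rcond=None)
adjusted = feature - X @ coef

print("correlation with group, before:", round(float(np.corrcoef(sensitive, feature)[0, 1]), 3))
print("correlation with group, after: ", round(float(np.corrcoef(sensitive, adjusted)[0, 1]), 3))
# A score built on `adjusted` is (linearly) unrelated to the sensitive variable.
```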