Collecting Sensitive Human Rights Data in the Field: A Case Study from Amritsar, India.
Romesh Silva and Jasmine Marwaha. “Collecting Sensitive Human Rights Data in the Field: A Case Study from Amritsar, India.” In JSM Proceedings, Social Statistics Section. Alexandria, VA. © 2011 American Statistical Association. All rights reserved.
Making the Case: Investigating Large Scale Human Rights Violations Using Information Systems and Data Analysis
Patrick Ball, Herbert F. Spirer, and Louise Spirer, eds. Making the Case: Investigating Large Scale Human Rights Violations Using Information Systems and Data Analysis. © 2000 American Association for the Advancement of Science. All rights reserved. Reprinted with permission. [full text] [intro] [chapters 1 2 3 4 5 6 7 8 9 10 11 12]
On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations
Romesh Silva. “On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations.” ASA Proceedings of the Joint Statistical Meetings, the International Biometric Society (ENAR and WNAR), the Institute of Mathematical Statistics, and the Statistical Society of Canada. August 2002.
Using Machine Learning to Help Human Rights Investigators Sift Massive Datasets
Where Stats and Rights Thrive Together
War and Illness Could Kill 85,000 Gazans in 6 Months
HRDAG director of research Patrick Ball is quoted in this New York Times article about a paper that models death tolls in Gaza.
Big Data Predictive Analytics Comes to Academic and Nonprofit Institutions to Fuel Innovation
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crime or where it will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce, and in some cases amplify, this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of the FAT conference, which brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
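The idea of stripping sensitive information from training data can be illustrated with a small sketch. The snippet below shows one simple, generic approach (residualizing each feature against the protected attribute so a downstream model cannot linearly recover it); it is an illustration only, not the specific method Lum developed, and the variable names and synthetic data are assumptions made for the example.

```python
# Illustrative sketch only: residualize features against a protected attribute.
# Not taken from HRDAG or Lum's code; all names and data here are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
n = 1_000

protected = rng.integers(0, 2, size=n)               # binary sensitive attribute
x1 = 0.8 * protected + rng.normal(size=n)             # feature that leaks the attribute
x2 = rng.normal(size=n)                               # feature unrelated to the attribute
X = np.column_stack([x1, x2])
y = (x2 + 0.3 * rng.normal(size=n) > 0).astype(int)   # outcome driven by x2 only

# Remove the component of each feature that is predictable from the attribute.
proj = LinearRegression().fit(protected.reshape(-1, 1), X)
X_fair = X - proj.predict(protected.reshape(-1, 1))

# A model trained on the adjusted features cannot base predictions on the attribute
# (at least not through any linear dependence left in the features).
model = LogisticRegression().fit(X_fair, y)
print("accuracy on adjusted features:", round(model.score(X_fair, y), 3))
print("corr(x1, protected) before:", round(float(np.corrcoef(X[:, 0], protected)[0, 1]), 3))
print("corr(x1, protected) after: ", round(float(np.corrcoef(X_fair[:, 0], protected)[0, 1]), 3))
```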
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Middle East
Liberian Truth and Reconciliation Commission Data
Mexico
Fourth CLS Story
HRDAG Names New Board Member Margot Gerritsen
Nonprofits Are Taking a Wide-Eyed Look at What Data Could Do
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
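The feedback loop described in this excerpt can be sketched with a toy simulation, under simplified assumptions. The code below is an illustration of the general mechanism, not PredPol's algorithm or HRDAG's analysis: two areas have identical true crime rates, but one starts with more recorded incidents, and patrols allocated in proportion to recorded incidents keep generating more reports where more patrols are sent.

```python
# Toy simulation of a predictive-policing feedback loop (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(1)
true_rate = np.array([10.0, 10.0])    # both areas have the same underlying crime rate
recorded = np.array([30.0, 10.0])     # but historical reports are skewed toward area 0
patrols_total = 20

for day in range(50):
    # Allocate patrols in proportion to recorded crime ("just doing what the math tells us").
    share = recorded / recorded.sum()
    patrols = patrols_total * share
    # New recorded crime depends on both true crime and how heavily each area is watched.
    detection = patrols / (patrols + 5.0)
    new_reports = rng.poisson(true_rate * detection)
    recorded = recorded + new_reports

# The initial disparity persists even though the true rates are identical.
print("final share of recorded crime (and hence patrols) per area:",
      (recorded / recorded.sum()).round(2))
```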