Karl E. Peace Award Recognizes Work of Patrick Ball
The American Statistical Association’s 2018 Karl E. Peace Award for Outstanding Statistical Contributions for the Betterment of Society recently recognized the work of leading human rights mathematician Patrick Ball of the Human Rights Data Analysis Group (HRDAG). The award is presented annually to statisticians whose exemplary statistical research is matched by the impact their work has had on the lives of people.
Established by the family of Karl E. Peace in honor of his work for the good of society, the award, announced at the Joint Statistical Meetings, is “bestowed upon distinguished individual(s) who have made substantial contributions to the statistical profession, contributions that have led in direct ways to improving the human condition. Recipients will have demonstrated through their accomplishments their commitment to service for the greater good.”
This year, Ball became the 10th recipient of the award.
Big Data Predictive Analytics Comes to Academic and Nonprofit Institutions to Fuel Innovation
Predictive policing violates more than it protects
William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.
Documents of war: Understanding the Syrian Conflict
Megan Price, Anita Gohdes, and Patrick Ball. 2015. Significance 12, no. 2 (April): 14–19. doi: 10.1111/j.1740-9713.2015.00811.x. © 2015 The Royal Statistical Society. All rights reserved. [online abstract]
Quantifying Injustice
“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
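The synthetic-population idea can be illustrated with a toy simulation. The sketch below is not Lum and Isaac's actual analysis; the district counts, offence rates, and detection probabilities are invented for the example. It shows the core point: when recorded crime reflects where police look rather than where offences occur, a model trained on the records inherits that skew, and a synthetic ground truth makes the skew measurable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy version of the synthetic-population check (all numbers invented):
# draw "true" offences from known base rates, then compare them with the
# offences that would be recorded under uneven police attention.
n_districts = 4
population = np.array([25_000] * n_districts)   # equal-size districts
true_rate = np.full(n_districts, 0.02)          # identical true offence rate

# Synthetic population: ground-truth offences, independent of policing.
true_offences = rng.binomial(population, true_rate)

# Recorded crime depends on where police look: district 0 is heavily patrolled.
detection_prob = np.array([0.60, 0.15, 0.15, 0.15])
recorded = rng.binomial(true_offences, detection_prob)

print("true offences per district:    ", true_offences)
print("recorded offences per district:", recorded)
# A prediction model trained on `recorded` would flag district 0 as a
# hotspot even though the synthetic ground truth is uniform.
```

In Lum and Isaac's published analysis, the ground-truth stand-in was a synthetic population of likely drug users built from public-health survey data, set against police drug-arrest records.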
The True Dangers of AI are Closer Than We Think
William Isaac is quoted.
How Many People Will Get Covid-19?
A Model to Estimate SARS-CoV-2-Positive Americans
Overbooking’s Impact on Pre-Trial Risk Assessment Tools
Fourth CLS Story
HRDAG Offers New R Package – dga
Nonprofits Are Taking a Wide-Eyed Look at What Data Could Do
How We Choose Projects
Coming soon: HRDAG 2019 Year-End Review
Guatemala CIIDH Data
HRDAG contributes to textbook Counting Civilian Casualties
Historic verdict in Guatemala—Gen. Efraín Ríos Montt found guilty
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
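The feedback loop Lum describes can be sketched in a few lines of simulation. This is a hypothetical toy dynamic, not PredPol's algorithm; the crime rates and the patrol-allocation rule are invented. If tomorrow's patrols follow today's recorded crime, and recording depends on patrol presence, early fluctuations compound into persistent over-policing of some districts.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy feedback loop (not PredPol itself; all parameters invented):
# patrols are allocated in proportion to yesterday's recorded crime,
# and recorded crime depends on how heavily a district is patrolled.
n_districts = 4
true_rate = np.full(n_districts, 20.0)      # identical true daily crime everywhere
patrol_share = np.full(n_districts, 0.25)   # start with even patrols

for day in range(200):
    # Recorded crime is true crime thinned by patrol attention.
    recorded = rng.poisson(true_rate * patrol_share)
    # The feedback step: tomorrow's patrols mirror today's records.
    patrol_share = (recorded + 1e-9) / (recorded.sum() + n_districts * 1e-9)

print("final patrol shares:", np.round(patrol_share, 3))
# Although true crime is identical in every district, the noise compounds:
# patrols typically end up concentrated in whichever districts happened to
# record more crime early on.
```

In the study Lum co-authored, the initial skew came from historically biased crime records rather than random noise, which makes the loop's direction predictable rather than arbitrary: the neighbourhoods already over-represented in the data keep attracting patrols.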