Patrick Ball wins the Karl E. Peace Award
HRDAG Retreat 2018
When Data Doesn’t Tell the Whole Story
Reflections: A Simple Plan
Predictive Policing Reinforces Police Bias
Guatemala: The Secret Files
Guatemala is still plagued by urban crime, but it is peaceful now compared to the decades of bloody civil war that convulsed the small Central American country. As he arrives in the capital, Guatemala City, FRONTLINE/World reporter Clark Boyd recalls, “When the fighting ended in the 1990s, many here wanted to move on, burying the secrets of the war along with hundreds of thousands of the dead and disappeared. But then, in July 2005, the past thundered back.”
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
How Data Processing Uncovers Misconduct in Use of Force in Puerto Rico
Liberia 2009 – Coding Testimony to Determine Accountability for War Crimes
100 Women in AI Ethics
We live in very challenging times. The pervasiveness of bias in AI algorithms, and the autonomous “killer” robots looming on the horizon, necessitate an open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, meaningful progress toward a world with safe, beneficial AI that will help, not hurt, the future of humanity.
53. Kristian Lum @kldivergence
Open Source Summit 2018
New death toll estimated in Syrian civil war
Kevin Uhrmacher of the Washington Post prepared a graph that illustrates reported deaths over time, by number of organizations reporting the deaths.
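A tabulation like the one underlying that graph can be sketched in a few lines of pandas. The records and column names below are hypothetical, invented purely to illustrate the shape of the analysis, not drawn from the actual Syria dataset:

```python
import pandas as pd

# Hypothetical matched records: one row per unique documented death,
# with the month it occurred and how many source organizations reported it.
deaths = pd.DataFrame({
    "month":     ["2012-01", "2012-01", "2012-02", "2012-02", "2012-02"],
    "n_sources": [1, 3, 2, 1, 4],
})

# Reported deaths over time, broken out by number of reporting organizations.
table = (deaths.groupby(["month", "n_sources"])
               .size()
               .unstack(fill_value=0))
print(table)
```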
Hissène Habré trial: Statistician reports a mortality rate of 2.37% per day
Expert hearings continue at the Dakar courthouse in the trial of former Chadian president Hissène Habré. Yesterday it was the turn of Patrick Ball, the only expert entered on the court roll, appointed by the N’Djamena indicting chamber to compile statistics on the mortality rate in the detention centers.
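For scale, a daily mortality rate of 2.37% compounds very quickly. A back-of-the-envelope sketch, assuming purely for illustration that the rate applied uniformly and independently each day:

```latex
% Survival probability after d days under a constant daily mortality rate r
S(d) = (1 - r)^d
% With r = 0.0237 (2.37% per day), after one month:
S(30) = (1 - 0.0237)^{30} \approx 0.49
```

Under this simplified reading, roughly half of a detained population would be expected to die within a month.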
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
A Comparison of Marginal and Conditional Models for Capture–Recapture Data with Application to Human Rights Violations Data
Shira Mitchell, Al Ozonoff, Alan Zaslavsky, Bethany Hedt-Gauthier, Kristian Lum and Brent Coull (2013). A Comparison of Marginal and Conditional Models for Capture-Recapture Data with Application to Human Rights Violations Data. Biometrics, Volume 69, Issue 4, pages 1022–1032, December 2013. © 2013, The International Biometric Society. DOI: 10.1111/biom.12089.
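For readers unfamiliar with capture-recapture, the simplest two-list version (the Lincoln-Petersen estimator) conveys the core idea behind the class of models the paper compares. This sketch, with hypothetical counts, is an illustration only, not the marginal or conditional models the authors analyze:

```python
def lincoln_petersen(n1: int, n2: int, m: int) -> float:
    """Estimate total population size from two overlapping lists.

    n1 -- records on list 1
    n2 -- records on list 2
    m  -- records matched on both lists
    """
    if m == 0:
        raise ValueError("no overlap between lists: estimator is undefined")
    return n1 * n2 / m

# Hypothetical counts: two NGOs document 300 and 200 violations,
# with 60 violations appearing on both lists.
print(lincoln_petersen(300, 200, 60))  # ~1000 total estimated violations
```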
RustConf 2019, and systems programming as a data scientist
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
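The feedback loop Lum and Isaac describe can be sketched as a short simulation. This is a deliberately simplified caricature under assumed numbers, not PredPol itself: two neighborhoods have identical true crime rates, but one starts with more recorded crime because of historical over-policing, and patrols are allocated in proportion to past records:

```python
import random

random.seed(0)

# Two neighborhoods with the SAME true number of crimes per day.
true_rate = {"A": 10, "B": 10}
# Historical over-policing: neighborhood A starts with more recorded crime.
recorded = {"A": 120, "B": 80}

for day in range(200):
    # "Predictive" allocation: patrol in proportion to past recorded crime.
    total = recorded["A"] + recorded["B"]
    patrol_share = {hood: count / total for hood, count in recorded.items()}
    for hood in recorded:
        # Police observe each true crime with probability equal to their
        # presence in that neighborhood, and the observation becomes a record.
        observed = sum(random.random() < patrol_share[hood]
                       for _ in range(true_rate[hood]))
        recorded[hood] += observed

# Neighborhood A accumulates far more recorded crime despite equal true rates,
# and each day's records further justify patrolling it more.
print(recorded)
```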