The True Dangers of AI are Closer Than We Think
William Isaac is quoted.
Quantifying Injustice
“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
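The synthetic-population idea mentioned in the excerpt can be illustrated with a toy simulation. This is not Lum and Isaac’s actual model; it is a minimal sketch under assumed numbers (two districts with identical true offense rates but unequal policing intensity) showing why recorded crime data can mislead a prediction model, and why a simulated population with known ground truth lets you measure that distortion.

```python
import random

# Hypothetical illustration of a "synthetic population": construct a
# simulated city whose true offense rates are known by design, then see
# what biased official records would report.

random.seed(42)

# Assumed ground truth: both districts offend at the same rate.
TRUE_RATE = 0.10
DISTRICTS = {"A": 50_000, "B": 50_000}

# Assumed bias: district A is patrolled more heavily, so offenses there
# are three times more likely to enter the official record.
DETECTION = {"A": 0.30, "B": 0.10}

def simulate(districts, true_rate, detection):
    """Return recorded offense counts per district."""
    recorded = {}
    for name, population in districts.items():
        offenders = sum(random.random() < true_rate for _ in range(population))
        recorded[name] = sum(random.random() < detection[name]
                             for _ in range(offenders))
    return recorded

recorded = simulate(DISTRICTS, TRUE_RATE, DETECTION)
print(recorded)
```

By construction the two districts are identical, yet district A’s recorded count is roughly three times district B’s; a model trained only on the recorded data would “learn” that A is higher-crime and direct still more patrols there. Because the synthetic ground truth is known, the gap between true and recorded rates directly quantifies the bias.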
War and Illness Could Kill 85,000 Gazans in 6 Months
HRDAG director of research Patrick Ball is quoted in this New York Times article about a paper that models death tolls in Gaza.
Want to know a police officer’s job history? There’s a new tool
NPR Illinois has covered the new National Police Index, created by HRDAG’s Tarak Shah, Ayyub Ibrahim of Innocence Project, and Sam Stecklow of Invisible Institute.
Can We Harness AI To Fulfill The Promise Of Universal Human Rights?
The Human Rights Data Analysis Group employs AI to analyze data from conflict zones, identifying patterns of human rights abuses that might be overlooked. This assists international organizations in holding perpetrators accountable.
‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley
Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.
“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.
Learning to Learn: Reflections on My Time at HRDAG
How We Choose Projects
HRDAG Names New Board Member William Isaac
Data Mining for Good: CJA Drink + Think
100 Women in AI Ethics
We live in very challenging times. The pervasiveness of bias in AI algorithms, and the autonomous “killer” robots looming on the horizon, necessitate open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, more meaningful progress toward a world with safe, beneficial AI that helps rather than hurts the future of humanity.
53. Kristian Lum @kldivergence
How Machine Learning Protects Whistle-Blowers in Staten Island
Hat-Tip from Guatemala Judges on HRDAG Evidence
Guatemala CIIDH Data
In Solidarity
Tech Note – improving LLM-driven info extraction
Death rate in Habre jails higher than for Japanese POWs, trial told
Patrick Ball of the California-based Human Rights Data Analysis Group said he had calculated the mortality rate of political prisoners from 1985 to 1988 using reports completed by Habre’s feared secret police.
Who Did What to Whom? Planning and Implementing a Large-Scale Human Rights Information Project.
Patrick Ball (2008). “Who Did What to Whom? Planning and Implementing a Large-Scale Human Rights Information Project.” (Originally in English at AAAS; Spanish translation by Beatriz Verjerano.) Palo Alto, California: Benetech.
A Definition of Database Design Standards for Human Rights Agencies.
Patrick Ball. “A Definition of Database Design Standards for Human Rights Agencies.” © 1994 American Association for the Advancement of Science. [pdf]
