Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
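To make the excerpt’s point concrete, here is a minimal sketch with entirely synthetic numbers (invented for illustration, not drawn from the book): a height-from-weight regression fit only on third graders, with the heavier children allowed to opt out, keeps predicting child-scale heights even for adult weights.

```python
# All numbers below are synthetic, chosen only to mirror the excerpt's scenario.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic third-grade class: weight in kg, height in cm. Within a single
# grade, weight says little about height, so the true slope is shallow.
weight = rng.normal(30, 5, size=300)
height = 120 + 0.25 * weight + rng.normal(0, 5, size=300)

# Selection bias: anyone self-conscious about their weight skips the exercise.
kept = weight < 35
weight, height = weight[kept], height[kept]

# Ordinary least squares fit: height ~ slope * weight + intercept.
slope, intercept = np.polyfit(weight, height, deg=1)

for w in (60, 80, 100):  # adult weights the model has never seen
    print(f"{w} kg -> predicted height {slope * w + intercept:.0f} cm")
# Predictions stay anchored near third-grade heights, far below typical adult
# heights: the flaw is in the training data, not in the fitting algorithm.
```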
Predictive policing tools send cops to poor/black neighborhoods
In this post, Cory Doctorow writes about the Significance article co-authored by Kristian Lum and William Isaac.
Trips to and from Guatemala
Welcome!
Social Science Scholars Award for HRDAG Book
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
What HBR Gets Wrong About Algorithms and Bias
“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared firsthand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”
Mapping Mexico’s hidden graves
When Patrick Ball, director of research at the Human Rights Data Analysis Group in San Francisco, California, was introduced to Ibero’s database, he saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.
Perú
In Syria, Uncovering the Truth Behind a Number
Huffington Post Politics writer Matt Easton interviews Patrick Ball, executive director of HRDAG, about the latest enumeration of killings in Syria. As selection bias increases, it becomes harder to detect: we have the “appearance of perfect knowledge, when in fact the shape of that knowledge has not changed that much,” says Patrick. “Technology is not a substitute for science.”
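As a rough numeric illustration of Patrick’s point (synthetic probabilities and counts, not HRDAG’s Syria figures): when some deaths are systematically less likely to be documented, collecting more reports shrinks the error bars without moving the estimate toward the truth.

```python
# Synthetic illustration only: documentation probabilities differ by region,
# so the observed share converges tightly to a biased value.
import numpy as np

rng = np.random.default_rng(1)

true_rural_share = 0.5              # half of all events occur in rural areas
p_doc_urban, p_doc_rural = 0.9, 0.2  # rural events are rarely documented

for n in (1_000, 10_000, 100_000):
    rural = rng.random(n) < true_rural_share
    documented = rng.random(n) < np.where(rural, p_doc_rural, p_doc_urban)
    print(f"n={n:>7}: documented={documented.sum():>6}, "
          f"observed rural share={rural[documented].mean():.3f}")
# The observed rural share settles near 0.18, not 0.5. More records make the
# biased estimate look more precise; only a statistical correction (such as
# the multiple systems estimation HRDAG uses) can recover the true share.
```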
Liberian Truth and Reconciliation Commission Data
New Report Raises Questions Over CPD’s Approach to Missing Persons Cases
In this video, Trina Reynolds-Tyler of Invisible Institute talks about her work with HRDAG on the missing persons project in Chicago and Beneath the Surface.
War and Illness Could Kill 85,000 Gazans in 6 Months
HRDAG director of research Patrick Ball is quoted in this New York Times article about a paper that models death tolls in Gaza.
Want to know a police officer’s job history? There’s a new tool
NPR Illinois has covered the new National Police Index, created by HRDAG’s Tarak Shah, Ayyub Ibrahim of Innocence Project, and Sam Stecklow of Invisible Institute.
Can We Harness AI To Fulfill The Promise Of Universal Human Rights?
The Human Rights Data Analysis Group employs AI to analyze data from conflict zones, identifying patterns of human rights abuses that might otherwise be overlooked. This assists international organizations in holding perpetrators accountable.
100 Women in AI Ethics
We live in very challenging times. The pervasiveness of bias in AI algorithms and the autonomous “killer” robots looming on the horizon necessitate open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, more meaningful progress toward a world with safe, beneficial AI that helps rather than hurts the future of humanity.
53. Kristian Lum @kldivergence