Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may permanently flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
The World According to Artificial Intelligence – The Bias in the Machine (Part 2)
Artificial intelligence might be a technological revolution unlike any other, transforming our homes, our work, our lives; but for many – the poor, minority groups, the people deemed to be expendable – their picture remains the same.
Patrick Ball is interviewed: “The question should be, Who bears the cost when a system is wrong?”
Documenting Syrian Deaths with Data Science
Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”
The Ways AI Decides How Low-Income People Work, Live, Learn, and Survive
HRDAG is mentioned in the “child welfare (sometimes called ‘family policing’)” section: “At least 72,000 low-income children are exposed to AI-related decision-making through government child welfare agencies’ use of AI to determine if they are likely to be neglected. As a result, these children experience heightened risk of being separated from their parents and placed in foster care.”
Tallying Syria’s War Dead
“Led by the nonprofit Human Rights Data Analysis Group (HRDAG), the process began with creating a merged dataset of ‘fully identified victims’ to avoid double counting. Only casualties whose complete details were listed — such as their full name, date of death and the governorate they had been killed in — were included on this initial list, explained Megan Price, executive director at HRDAG. If details were missing, the victim could not be confidently cross-checked across the eight organizations’ lists, and so was excluded. This provided HRDAG and the U.N. with a minimum count of individuals whose deaths were fully documented by at least one of the different organizations. …”
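The excerpt above describes, in effect, an exact-match de-duplication over the eight documentation lists. Here is a minimal sketch of that minimum-count logic, assuming hypothetical CSV inputs with full_name, date_of_death, and governorate columns (HRDAG’s actual record linkage is far more sophisticated than an exact key match):

```python
import csv

# Hypothetical column names; the real lists use richer schemas.
REQUIRED = ("full_name", "date_of_death", "governorate")

def load_fully_identified(path):
    """Keep only records where every identifying field is present."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row for row in csv.DictReader(f)
                if all(row.get(k, "").strip() for k in REQUIRED)]

def minimum_count(paths):
    """Merge several organizations' lists and de-duplicate on the
    identifying key, giving a conservative minimum number of victims."""
    seen = set()
    for path in paths:
        for row in load_fully_identified(path):
            seen.add(tuple(row[k].strip().lower() for k in REQUIRED))
    return len(seen)

# e.g. minimum_count(["org1.csv", ..., "org8.csv"]) over the eight lists
```

Records missing any identifying field are dropped rather than guessed at, which is why the result is a floor on the true death toll rather than an estimate of it.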
Amnesty report damns Syrian government on prison abuse
An excerpt: The “It breaks the human” report released by the human rights group Amnesty International highlights new statistics from the Human Rights Data Analysis Group, or HRDAG, an organization that uses scientific approaches to analyze human rights violations.
Using Data to Reveal Human Rights Abuses
Profile touching on HRDAG’s work on the trial and conviction of Hissène Habré, its US Policing Project, data integrity, data archaeology and more.
Data and Social Good: Using Data Science to Improve Lives, Fight Injustice, and Support Democracy
In this free, downloadable report, Mike Barlow of O’Reilly Media cites several examples of how data and the work of data scientists have made a measurable impact on organizations such as DataKind, a group that connects socially minded data scientists with organizations working to address critical humanitarian issues. HRDAG—and executive director Megan Price—is one of the first organizations whose work is mentioned.
All the Dead We Cannot See
Ball, a statistician, has spent the last two decades finding ways to make the silence speak. He helped pioneer the use of formal statistical modeling, and, later, machine learning—tools more often used for e-commerce or digital marketing—to measure human rights violations that weren’t recorded. In Guatemala, his analysis helped convict former dictator General Efraín Ríos Montt of genocide in 2013. It was the first time a former head of state was found guilty of the crime in his own country.
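The profile does not spell out the method, but one standard family of techniques for measuring events that were never recorded is capture-recapture (multiple systems estimation), which infers the unseen total from how much independent lists of documented cases overlap. A deliberately simplified two-list sketch:

```python
# Illustrative two-list capture-recapture (Lincoln-Petersen) estimate.
# A textbook simplification, not a description of HRDAG's actual models.

def lincoln_petersen(n1: int, n2: int, overlap: int) -> float:
    """Estimate total population size from two overlapping lists.

    n1, n2  -- number of cases documented on each list
    overlap -- cases appearing on both lists (matched records)
    """
    if overlap == 0:
        raise ValueError("no overlap: lists cannot be combined this way")
    return n1 * n2 / overlap

# Two lists of 1,000 and 800 documented killings sharing 400 matches
# suggest roughly 1000 * 800 / 400 = 2,000 killings, recorded or not.
print(lincoln_petersen(1000, 800, 400))  # 2000.0
```

Real casework must also model dependence between lists and uneven probabilities of documentation, which this two-list formula ignores.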
Undercover Minnesota officers suing oversight board have public LinkedIns, discipline and shootings
“In January, Invisible Institute released the data on a tool called the National Police Index, which houses data from over two dozen of POST’s peer agencies around the country. Developed by Invisible Institute, Human Rights Data Analysis Group, and Innocence & Justice Louisiana, the NPI seeks employment history data from state POST agencies to track, among other questions, the issue of so-called ‘wandering cops’ who move from department to department after committing misconduct.”
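A hypothetical sketch of the kind of query such employment-history data enables: group records by officer and flag careers spanning multiple departments. The record fields and threshold are assumptions for illustration, not the NPI’s actual schema:

```python
from collections import defaultdict

def flag_wandering(records, min_agencies=3):
    """records: iterable of dicts with 'officer_id' and 'agency' keys.

    Returns officers employed by at least min_agencies departments,
    candidates for closer review against misconduct records.
    """
    agencies = defaultdict(set)
    for r in records:
        agencies[r["officer_id"]].add(r["agency"])
    return {oid: len(a) for oid, a in agencies.items()
            if len(a) >= min_agencies}
```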
Justice by the Numbers
Wilkerson was speaking at the inaugural Conference on Fairness, Accountability, and Transparency, a gathering of academics and policymakers working to make the algorithms that govern growing swaths of our lives more just. The woman who’d invited him there was Kristian Lum, the 34-year-old lead statistician at the Human Rights Data Analysis Group, a San Francisco-based non-profit that has spent more than two decades applying advanced statistical models to expose human rights violations around the world. For the past three years, Lum has deployed those methods to tackle an issue closer to home: the growing use of machine learning tools in America’s criminal justice system.
