The Ways AI Decides How Low-Income People Work, Live, Learn, and Survive
HRDAG is mentioned in the “child welfare (sometimes called ‘family policing’)” section: At least 72,000 low-income children are exposed to AI-related decision-making through government child welfare agencies’ use of AI to determine whether they are likely to be neglected. As a result, these children experience heightened risk of being separated from their parents and placed in foster care.
Why top funders back this small human rights organization with a global reach
Eric Sears, a director at the MacArthur Foundation who leads the grantmaker’s Technology in the Public Interest program, worked at Human Rights First and Amnesty International before joining MacArthur, and has been following HRDAG’s work for years. … One of HRDAG’s strengths is the long relationships it maintains with partners around the globe. “HRDAG is notable in that it really develops deep relationships and partnerships and trust with organizations and actors in different parts of the world,” Sears said. “I think they’re unique in the sense that they don’t parachute into a situation and do a project and leave. They tend to stick with organizations and with issues over the long term, and continually help build cases around evidence and documentation to ensure that when the day comes, when accountability is possible, the facts and the evidence are there.”
Calculating US police killings using methodologies from war-crimes trials
Cory Doctorow of Boing Boing writes about HRDAG director of research Patrick Ball’s article “Violence in Blue,” published March 4 in Granta. From the post: “In a must-read article in Granta, Ball explains the fundamentals of statistical estimation, and then applies these techniques to US police killings, merging data-sets from the police and the press to arrive at an estimate of the knowable US police homicides (about 1,250/year) and the true total (about 1,500/year). That means that of all the killings by strangers in the USA, one third are committed by the police.”
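As a minimal illustration of the estimation technique Ball describes, here is the simplest two-list capture-recapture calculation (the Lincoln-Petersen estimator). All counts below are invented, chosen only to land near the article’s round figures; they are not the data behind “Violence in Blue,” which relies on more sophisticated multiple-systems estimation.

```python
# Two-list capture-recapture (Lincoln-Petersen) sketch.
# Counts are hypothetical, chosen to land near the article's figures;
# they are NOT the data behind "Violence in Blue".

police = 900   # killings documented in police records (hypothetical)
press = 800    # killings documented in press reports (hypothetical)
overlap = 480  # killings appearing on both lists (hypothetical)

# The "knowable" killings are the union of the two lists.
documented = police + press - overlap  # -> 1220

# Lincoln-Petersen: if the lists are independent samples of the same
# population, N_hat = police * press / overlap.
estimated_total = police * press / overlap  # -> 1500.0

print(f"documented: {documented}, estimated total: {estimated_total:.0f}")
```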
Undercover Minnesota officers suing oversight board have public LinkedIns, discipline and shootings
“In January, Invisible Institute released the data on a tool called the National Police Index, which houses data from over two dozen of POST’s peer agencies around the country. Developed by Invisible Institute, Human Rights Data Analysis Group, and Innocence & Justice Louisiana, the NPI seeks employment history data from state POST agencies to track, among other questions, the issue of so-called ‘wandering cops’ who move from department to department after committing misconduct.”
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”
Amnesty report damns Syrian government on prison abuse
An excerpt: The “It breaks the human” report released by the human rights group Amnesty International highlights new statistics from the Human Rights Data Analysis Group, or HRDAG, an organization that uses scientific approaches to analyze human rights violations.
Using Data to Reveal Human Rights Abuses
Profile touching on HRDAG’s work on the trial and conviction of Hissène Habré, its US Policing Project, data integrity, data archaeology, and more.
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade-three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
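A minimal sketch of that failure mode, assuming invented numbers throughout: a regression fit only on a truncated grade-three sample can only ever predict child-sized heights.

```python
# Invented-numbers sketch: fit height-from-weight on third-graders only,
# with self-conscious heavier kids opting out, and see what it predicts.
import numpy as np

rng = np.random.default_rng(0)

# Grade-three class: weights in pounds, heights in inches.
weights = rng.normal(58, 7, 300)
heights = 34 + 0.25 * weights + rng.normal(0, 1.2, 300)

# Selection bias: anyone self-conscious about their weight skips the exercise.
keep = weights < 65
slope, intercept = np.polyfit(weights[keep], heights[keep], 1)

# The model has only ever seen children, so its predictions cluster
# around four feet no matter who it is applied to.
preds = intercept + slope * weights[keep]
print(f"mean predicted height: {preds.mean() / 12:.1f} feet")  # ~4.0
```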
Documenting Syrian Deaths with Data Science
Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
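For illustration, a rough sketch of what a county-level classifier with those predictors might look like. The file name, column names, and choice of a random forest are assumptions made for this sketch, not HRDAG’s actual pipeline.

```python
# Hypothetical sketch of a county-level classifier using the predictor
# variables named in the article. The file name, column names, and model
# family are assumptions for illustration -- not HRDAG's actual pipeline.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

counties = pd.read_csv("counties.csv")  # hypothetical county-level table

features = [
    "drug_lab_busted",   # a drug lab has been busted in the county
    "borders_us",        # county borders the United States
    "borders_ocean",     # county borders the ocean
    "pct_mountainous",   # share of the county that is mountainous
    "has_highway",       # presence of highways
    "school_results",    # primary/secondary school academic results
]
X = counties[features]
y = counties["grave_found"]  # label built from automated media searches

model = RandomForestClassifier(n_estimators=500, random_state=0)
print(cross_val_score(model, X, y, scoring="roc_auc", cv=5).mean())
```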
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work shows that “predictive policing” finds patterns in police records, not in the actual occurrence of crime.
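A toy simulation of that point, with invented numbers: when patrols are dispatched wherever past records show the most crime, and crime is recorded only where an officer is present, the records diverge from reality even when the underlying crime rates are identical.

```python
# Toy feedback loop: two districts with identical true crime rates, but
# patrols go wherever past *records* show the most crime, and crime is
# only recorded where an officer is present. All numbers invented.
import numpy as np

rng = np.random.default_rng(0)
true_rate = np.array([0.5, 0.5])  # true crime rates are equal
recorded = np.array([1.0, 0.0])   # seed bias: district 0 was patrolled first

for _ in range(200):
    p_patrol = recorded / recorded.sum()  # "predict" from past records
    district = rng.choice(2, p=p_patrol)  # send the patrol there
    if rng.random() < true_rate[district]:
        recorded[district] += 1           # crime recorded only if seen

print(recorded)  # heavily skewed toward district 0 despite equal true rates
```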
Data and Social Good: Using Data Science to Improve Lives, Fight Injustice, and Support Democracy
In this free, downloadable report, Mike Barlow of O’Reilly Media cites several examples of how data and the work of data scientists have made a measurable impact on organizations such as DataKind, a group that connects socially minded data scientists with organizations working to address critical humanitarian issues. HRDAG, along with executive director Megan Price, is among the first organizations whose work is mentioned.
On clandestine graves, we have more information than the government: Ibero
The model “can distinguish between the municipalities where we are going to find clandestine graves and those where it is unlikely that we will find them,” explained Patrick Ball, an American statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.
System created to predict clandestine graves in Mexico
To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are conducting a statistical analysis built on a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.

