System created to predict clandestine graves in Mexico
For this reason, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are carrying out a statistical analysis built around a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.
On clandestine graves, we have more information than the government: Ibero
The model “can distinguish between the municipalities where we are going to find clandestine graves and those where it is unlikely that we will find them,” explained Patrick Ball, an American statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
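As a rough sketch of the kind of municipality-level classification the article describes (this is not HRDAG’s actual model; the file name, feature columns, label, and model choice below are all hypothetical), the setup might look something like this:

```python
# Minimal sketch of a municipality-level grave/no-grave classifier.
# NOT HRDAG's model: the file name, feature columns, and label are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per municipality, with a binary label recording
# whether a clandestine grave has been reported there.
df = pd.read_csv("municipios.csv")

features = [
    "drug_lab_busted",    # a drug lab has been seized in the municipality
    "borders_us",         # shares a border with the United States
    "borders_ocean",      # touches the coast
    "pct_mountainous",    # share of terrain that is mountainous
    "highway_km",         # kilometers of highway
    "school_test_score",  # mean primary/secondary exam results
]
X, y = df[features], df["grave_reported"]

model = RandomForestClassifier(n_estimators=500, random_state=0)
print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
```

A tree ensemble is used in the sketch only because it handles a mix of binary and continuous predictors with little preprocessing; it is not a claim about the method actually used.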
Documenting Syrian Deaths with Data Science
Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”
Trump’s “extreme-vetting” software will discriminate against immigrants “Under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
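Lum’s point is essentially about base rates: when the behavior being screened for is extremely rare, even a fairly accurate classifier produces flags that are overwhelmingly false positives. A back-of-the-envelope calculation, with numbers invented purely for illustration, makes the arithmetic concrete:

```python
# Base-rate illustration; every number here is invented for the example.
population = 1_000_000
base_rate = 10 / 1_000_000   # assume 10 genuine cases per million people
sensitivity = 0.90           # probability a genuine case is flagged
false_positive_rate = 0.01   # probability an innocent person is flagged

true_flags = population * base_rate * sensitivity
false_flags = population * (1 - base_rate) * false_positive_rate

print(f"correctly flagged: {true_flags:.0f}")    # about 9 people
print(f"wrongly flagged:   {false_flags:.0f}")   # about 10,000 people
print(f"share of flags that are false: {false_flags / (true_flags + false_flags):.1%}")
```

Under these assumptions, well over 99 percent of the people flagged would never have gone on to commit the predicted act.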
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.
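A toy simulation, with made-up numbers and purely to echo Ball’s example, shows how a model trained only on third graders keeps answering “about four feet” no matter who it is asked about:

```python
# Toy version of Ball's height-from-weight example; all numbers are invented.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Biased training set: only grade-three students, all roughly four feet tall.
train_weight = rng.normal(60, 8, size=(500, 1))   # pounds
train_height = rng.normal(50, 2, size=500)        # inches (about 4 feet)

model = KNeighborsRegressor(n_neighbors=10).fit(train_weight, train_height)

# Asked about adults, the model still answers "about four feet",
# because it has never seen anyone else.
adult_weights = np.array([[150.0], [180.0], [210.0]])
print(model.predict(adult_weights))   # all close to 50 inches
```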
Using Data to Reveal Human Rights Abuses
Profile touching on HRDAG’s work on the trial and conviction of Hissène Habré, its US Policing Project, data integrity, data archaeology and more.
Amnesty report damns Syrian government on prison abuse
An excerpt: The “It breaks the human” report released by the human rights group Amnesty International highlights new statistics from the Human Rights Data Analysis Group, or HRDAG, an organization that uses scientific approaches to analyze human rights violations.
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”
Recognising Uncertainty in Statistics
In Responsible Data Reflection Story #7, from the Responsible Data Forum, work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point that quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”
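Multiple systems estimation, mentioned in the excerpt, is a family of capture-recapture methods. In its simplest two-list form, the Lincoln-Petersen estimator, the overlap between independent lists of documented killings is used to estimate how many were never recorded by either source; the counts below are invented for illustration only:

```python
# Two-list capture-recapture (Lincoln-Petersen), the simplest form of multiple
# systems estimation. The counts are invented; real MSE work uses more lists
# and models the dependence between them.
def lincoln_petersen(n_a: int, n_b: int, n_both: int) -> float:
    """Estimate the total number of events from two overlapping lists."""
    return n_a * n_b / n_both

n_a, n_b, n_both = 400, 300, 60   # killings on list A, on list B, and on both
total = lincoln_petersen(n_a, n_b, n_both)
documented = n_a + n_b - n_both

print(f"estimated total killings: {total:.0f}")                  # 2000
print(f"documented on at least one list: {documented}")          # 640
print(f"estimated never documented: {total - documented:.0f}")   # 1360
```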
Data and Social Good: Using Data Science to Improve Lives, Fight Injustice, and Support Democracy
In this free, downloadable report, Mike Barlow of O’Reilly Media cites several examples of how data and the work of data scientists have made a measurable impact on organizations such as DataKind, a group that connects socially minded data scientists with organizations working to address critical humanitarian issues. HRDAG—and executive director Megan Price—is one of the first organizations whose work is mentioned.
Why top funders back this small human rights organization with a global reach
Eric Sears, a director at the MacArthur Foundation who leads the grantmaker’s Technology in the Public Interest program, worked at Human Rights First and Amnesty International before joining MacArthur, and has been following HRDAG’s work for years. … One of HRDAG’s strengths is the long relationships it maintains with partners around the globe. “HRDAG is notable in that it really develops deep relationships and partnerships and trust with organizations and actors in different parts of the world,” Sears said. “I think they’re unique in the sense that they don’t parachute into a situation and do a project and leave. They tend to stick with organizations and with issues over the long term, and continually help build cases around evidence and documentation to ensure that when the day comes, when accountability is possible, the facts and the evidence are there.”
The Ways AI Decides How Low-Income People Work, Live, Learn, and Survive
HRDAG is mentioned in the “child welfare (sometimes called ‘family policing’)” section: At least 72,000 low-income children are exposed to AI-related decision-making through government child welfare agencies’ use of AI to determine if they are likely to be neglected. As a result, these children experience heightened risk of being separated from their parents and placed in foster care.
Experts Greet Kosovo Memory Book
On Wednesday, February 4, in Pristina, international experts praised the Humanitarian Law Centre’s database on victims of the Kosovo conflict, the Kosovo Memory Book. HRDAG executive director Patrick Ball is quoted in the article that appeared in Balkan Transitional Justice.
A look at the top contenders for the 2022 Nobel Peace Prize
The Washington Post’s Paul Schemm recognized HRDAG’s work in Syria, in the category of research and activism. “HRDAG gained renown at the start of the war, when it was one of the few organizations that tried to put a number on the war’s enormous toll in Syrian lives.”
AI for Human Rights
From the article: “Price described the touchstone of her organization as being a tension between how truth is simultaneously discovered and obscured. HRDAG is at the intersection of this tension; they are consistently participating in science’s progressive uncovering of what is true, but they are accustomed to working in spaces where this truth is denied. Among the many responsibilities HRDAG holds in its work is that of ‘speaking truth to power,’ said Price, ‘and if that’s what you’re doing, you have to know that your truth stands up to adversarial environments.’”