Want to know a police officer’s job history? There’s a new tool
NPR Illinois has covered the new National Police Index, created by HRDAG’s Tarak Shah, Ayyub Ibrahim of Innocence Project, and Sam Stecklow of Invisible Institute.
System Created to Predict Clandestine Graves in Mexico
To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are conducting a statistical analysis built on a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.
Can We Harness AI To Fulfill The Promise Of Universal Human Rights?
The Human Rights Data Analysis Group employs AI to analyze data from conflict zones, identifying patterns of human rights abuses that might be overlooked. This assists international organizations in holding perpetrators accountable.
The Ways AI Decides How Low-Income People Work, Live, Learn, and Survive
HRDAG is mentioned in the “child welfare (sometimes called ‘family policing’)” section: At least 72,000 low-income children are exposed to AI-related decision-making through government child welfare agencies’ use of AI to determine whether they are likely to be neglected. As a result, these children experience a heightened risk of being separated from their parents and placed in foster care.
Tallying Syria’s War Dead
“Led by the nonprofit Human Rights Data Analysis Group (HRDAG), the process began with creating a merged dataset of ‘fully identified victims’ to avoid double counting. Only casualties whose complete details were listed — such as their full name, date of death and the governorate they had been killed in — were included on this initial list, explained Megan Price, executive director at HRDAG. If details were missing, the victim could not be confidently cross-checked across the eight organizations’ lists, and so was excluded. This provided HRDAG and the U.N. with a minimum count of individuals whose deaths were fully documented by at least one of the different organizations. …”
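To make the cross-referencing step concrete, here is a minimal sketch of that kind of list deduplication, assuming pandas DataFrames. The field names (full_name, date_of_death, governorate) and the sample records are hypothetical illustrations, not HRDAG's actual data or code.

```python
# Minimal sketch of merging casualty lists and collapsing duplicates.
# Field names and sample records are hypothetical, not HRDAG's data.
import pandas as pd

REQUIRED = ["full_name", "date_of_death", "governorate"]

def fully_identified(df: pd.DataFrame) -> pd.DataFrame:
    """Keep only records where every identifying field is present."""
    return df.dropna(subset=REQUIRED)

def merge_lists(lists: list[pd.DataFrame]) -> pd.DataFrame:
    """Stack the organizations' lists, then collapse exact duplicates
    so each fully identified victim is counted once."""
    stacked = pd.concat([fully_identified(df) for df in lists],
                        ignore_index=True)
    return stacked.drop_duplicates(subset=REQUIRED)

# Hypothetical usage with two tiny lists sharing one record:
a = pd.DataFrame({"full_name": ["A. Example", None],
                  "date_of_death": ["2013-05-01", "2013-06-02"],
                  "governorate": ["Aleppo", "Homs"]})
b = pd.DataFrame({"full_name": ["A. Example"],
                  "date_of_death": ["2013-05-01"],
                  "governorate": ["Aleppo"]})
print(len(merge_lists([a, b])))  # -> 1: the shared record counts once
```

Exact matching on all three fields mirrors the conservative "minimum count" logic quoted above: an incomplete record is excluded rather than guessed at, so no one is double counted.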
On Clandestine Graves, We Have More Information Than the Government: Ibero
The model “can distinguish between the municipalities where we are going to find clandestine graves, and those where it is unlikely that we will find them,” explained Patrick Ball, the American statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.
Why top funders back this small human rights organization with a global reach
Eric Sears, a director at the MacArthur Foundation who leads the grantmaker’s Technology in the Public Interest program, worked at Human Rights First and Amnesty International before joining MacArthur, and has been following HRDAG’s work for years. … One of HRDAG’s strengths is the long relationships it maintains with partners around the globe. “HRDAG is notable in that it really develops deep relationships and partnerships and trust with organizations and actors in different parts of the world,” Sears said. “I think they’re unique in the sense that they don’t parachute into a situation and do a project and leave. They tend to stick with organizations and with issues over the long term, and continually help build cases around evidence and documentation to ensure that when the day comes, when accountability is possible, the facts and the evidence are there.”
New Report Raises Questions Over CPD’s Approach to Missing Persons Cases
In this video, Trina Reynolds-Tyler of Invisible Institute talks about her work with HRDAG on the missing persons project in Chicago and Beneath the Surface.
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
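The feedback loop Lum describes can be reproduced in a few lines. Below is a toy simulation, not PredPol's actual algorithm: the region names, rates, and initial report counts are invented, and patrols are simply sent wherever recorded reports are highest.

```python
# Toy simulation of a predictive-policing feedback loop.
# Regions, rates, and counts are hypothetical.
import random

random.seed(0)
true_rate = {"A": 0.5, "B": 0.5}   # identical underlying crime rates
reports = {"A": 12, "B": 10}       # slight historical skew in the data

for day in range(50):
    # send the patrol to the region with the most recorded reports
    target = max(reports, key=reports.get)
    # crime is only recorded where officers are present to observe it
    if random.random() < true_rate[target]:
        reports[target] += 1

print(reports)  # region "A" pulls far ahead despite equal true rates
```

Because only the patrolled region can generate new reports, the small initial skew in the historical data compounds, even though the underlying rates are identical.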
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
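As a rough illustration of how such county-level predictors might feed a classifier, here is a hedged sketch using scikit-learn. The feature names, training rows, and model choice (logistic regression) are all assumptions for illustration, not the actual HRDAG/Ibero/Data Cívica model.

```python
# Hypothetical county-level classifier with features like those
# Ball describes; the data and model choice are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: drug_lab_bust, borders_us, borders_ocean,
#          pct_mountainous, highway_count, school_score
X = np.array([
    [1, 1, 0, 0.30, 4, 0.42],
    [0, 0, 1, 0.10, 2, 0.61],
    [0, 0, 0, 0.65, 1, 0.55],
    [1, 0, 1, 0.20, 5, 0.38],
])
y = np.array([1, 0, 0, 1])  # 1 = clandestine graves were found

model = LogisticRegression().fit(X, y)
print(model.predict_proba(X)[:, 1])  # P(graves found) per county
```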
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
Documenting Syrian Deaths with Data Science
Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”
War and Illness Could Kill 85,000 Gazans in 6 Months
HRDAG director of research Patrick Ball is quoted in this New York Times article about a paper that models death tolls in Gaza.
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
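Patrick's example is easy to reproduce numerically. The sketch below fits a line to made-up third-grader data; every weight the model has seen maps to roughly four feet, so that is all it can predict.

```python
# Toy version of the height-from-weight example; numbers are made up.
import numpy as np

# hypothetical third-graders: weight (lbs), height (inches)
weight = np.array([55, 60, 62, 58, 65, 70])
height = np.array([48, 50, 51, 49, 52, 53])  # all roughly 4 feet

slope, intercept = np.polyfit(weight, height, 1)
for w in [50, 60, 70]:
    feet = (slope * w + intercept) / 12
    print(w, "lbs ->", round(feet, 1), "feet")  # every answer is ~4 feet
```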
Using Data to Reveal Human Rights Abuses
Profile touching on HRDAG’s work on the trial and conviction of Hissène Habré, its US Policing Project, data integrity, data archaeology and more.
Amnesty report damns Syrian government on prison abuse
An excerpt: The “It breaks the human” report released by the human rights group Amnesty International highlights new statistics from the Human Rights Data Analysis Group, or HRDAG, an organization that uses scientific approaches to analyze human rights violations.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
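Lum's point is a base-rate argument, and a back-of-the-envelope calculation shows the scale. All numbers below are hypothetical assumptions, not figures from the letter.

```python
# Back-of-the-envelope base-rate calculation; every number is assumed.
base_rate = 1 / 100_000       # assumed prevalence of future terrorists
sensitivity = 0.9             # assumed: tool flags 90% of true positives
false_positive_rate = 0.01    # assumed: flags 1% of everyone else

screened = 10_000_000
true_pos = screened * base_rate * sensitivity                 # ~90
false_pos = screened * (1 - base_rate) * false_positive_rate  # ~100,000

print(true_pos, false_pos)
print(true_pos / (true_pos + false_pos))  # <0.1% of flags are correct
```

Even under an optimistic 1% false-positive rate, fewer than one in a thousand flagged people would be a true positive; the rest are exactly the "real people" Lum warns about.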