Experts Greet Kosovo Memory Book
On Wednesday, February 4, in Pristina, international experts praised the Humanitarian Law Centre’s database on victims of the Kosovo conflict, the Kosovo Memory Book. HRDAG executive director Patrick Ball is quoted in the article that appeared in Balkan Transitional Justice.
Civil War in Syria: The Internet as a Weapon of War
Süddeutsche Zeitung writer Hakan Tanriverdi interviews HRDAG affiliate Anita Gohdes and writes about her work on the Syrian casualty enumeration project for the UN Office of the High Commissioner for Human Rights. This article, “Bürgerkrieg in Syrien: Das Internet als Kriegswaffe,” is in German.
Data and Social Good: Using Data Science to Improve Lives, Fight Injustice, and Support Democracy
In this free, downloadable report, Mike Barlow of O’Reilly Media cites several examples of how data and the work of data scientists have made a measurable impact on organizations such as DataKind, a group that connects socially minded data scientists with organizations working to address critical humanitarian issues. HRDAG, along with executive director Megan Price, is among the first organizations whose work is mentioned.
Download: Megan Price
Executive director Megan Price is interviewed in The New York Times’ Sunday Review, as part of a series known as “Download,” which features biosketches of “influencers and their interests.”
The Case Against a Golden Key
Patrick Ball (2016). The case against a golden key. Foreign Affairs. September 14, 2016. © 2016 Council on Foreign Relations, Inc. All Rights Reserved.
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”
Inside Syria’s prisons, where an estimated 17,723 have died since 2011
Excerpt from the article: The estimate is based on reports from four organizations investigating deaths in Syria from March 15, 2011, to December 31, 2015. From those cases, the Human Rights Data Analysis Group identified 12,270 cases with sufficient information to confirm the person was killed in detention. Using a statistical method to estimate how many victims they do not yet know about, the group came up with 17,723 cases.
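The “statistical method” referred to is multiple systems estimation, which models the overlaps among the four organizations’ lists. As a rough illustration of the core idea only (HRDAG’s actual model is more involved), here is a minimal two-list capture-recapture sketch in Python, with hypothetical counts:

```python
# Minimal two-list capture-recapture (Lincoln-Petersen) sketch.
# HRDAG's actual estimate combines four lists via multiple systems
# estimation; this shows only the core idea. All counts are hypothetical.

def lincoln_petersen(n1: int, n2: int, m: int) -> float:
    """Estimate the total population size from two overlapping lists.

    n1 -- cases documented by list A
    n2 -- cases documented by list B
    m  -- cases appearing on both lists (the overlap)
    """
    if m == 0:
        raise ValueError("estimator is undefined with no overlap")
    return n1 * n2 / m

# Hypothetical counts: the smaller the overlap between two independent
# lists, the more victims the model implies that neither list has seen.
print(lincoln_petersen(n1=9000, n2=7000, m=4200))  # -> 15000.0
```

The intuition: if two independently compiled lists overlap heavily, few victims are undocumented; sparse overlap implies many unseen cases.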
Predictive policing violates more than it protects
William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.
DATNAV: New Guide to Navigate and Integrate Digital Data in Human Rights Research
DatNav is the result of a collaboration between Amnesty International, Benetech, and The Engine Room that began in late 2015 and culminated in an intense four-day writing sprint facilitated by Chris Michael and Collaborations for Change in May 2016. HRDAG consultant Jule Krüger is a contributor, and HRDAG director of research Patrick Ball is a reviewer.
Are journalists lowballing the number of Iraqi war dead?
The Columbia Journalism Review investigates the casualty count in Iraq, more than a decade after the U.S. invasion. HRDAG executive director Patrick Ball is quoted: “IBC is very good at covering the bombs that go off in markets,” said Ball, who says he has spent his whole career studying “people being killed.” But quiet assassinations and military skirmishes away from the capital often receive little or no media attention.
Big data may be reinforcing racial bias in the criminal justice system
Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. February 10, 2017. © 2017 Washington Post.
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s analysis shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.
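A generic illustration of the trade-off Ball describes (not HRDAG code; the scores, labels, and thresholds below are made up): the same classifier, cut at two different thresholds, makes two different mixes of errors:

```python
# Toy example of choosing which kind of error a classifier makes.
# Scores and ground-truth labels are hypothetical.
scores = [0.1, 0.3, 0.35, 0.6, 0.8, 0.9]
labels = [0,   0,   1,    0,   1,   1]

for threshold in (0.3, 0.7):
    fp = sum(s >= threshold and y == 0 for s, y in zip(scores, labels))
    fn = sum(s < threshold and y == 1 for s, y in zip(scores, labels))
    print(f"threshold={threshold}: false positives={fp}, false negatives={fn}")
# threshold=0.3: false positives=2, false negatives=0
# threshold=0.7: false positives=0, false negatives=1
```

Lowering the threshold trades false negatives for false positives; which trade is acceptable is a policy choice, not a purely technical one.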
Mapping Mexico’s hidden graves
When Patrick Ball, director of research at the Human Rights Data Analysis Group in San Francisco, California, was introduced to Ibero’s database, he saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.
Rise of the racist robots – how AI is learning all our worst impulses
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
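A minimal simulation of the feedback loop Lum describes (an illustration of the dynamic only, with hypothetical parameters, not a reconstruction of Lum and Isaac’s actual PredPol analysis):

```python
# Two areas with identical true crime rates, but patrols go wherever the
# *recorded* history is higher, and crime is recorded only where police
# are present. A small initial imbalance in the records never corrects.
import random

random.seed(1)

def true_incidents() -> int:
    # Both areas share the same underlying crime process.
    return sum(random.random() < 0.1 for _ in range(500))

recorded = [10, 11]  # hypothetical, slightly imbalanced historical records

for day in range(365):
    hotspot = 0 if recorded[0] >= recorded[1] else 1
    recorded[hotspot] += true_incidents()  # only patrolled crime is seen

print(recorded)  # area 1's record dwarfs area 0's despite equal true rates
```

Because the model “learns” only from what police record, the area patrolled first keeps generating the very data that justifies patrolling it again.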