Inside the Difficult, Dangerous Work of Tallying the ISIS Death Toll
HRDAG executive director Megan Price is interviewed by Mother Jones. An excerpt: “Violence can be hidden,” says Price. “ISIS has its own agenda. Sometimes that agenda is served by making public things they’ve done, and I have to assume, sometimes it’s served by hiding things they’ve done.”
Recognising Uncertainty in Statistics
In Responsible Data Reflection Story #7—from the Responsible Data Forum—work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point that quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”
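The simplest form of the multiple systems estimation mentioned above is the two-list Lincoln-Petersen estimator: compare two independent lists of documented victims, and use the size of their overlap to estimate how many victims neither list captured. A minimal sketch, with invented victim identifiers (this is an illustration of the general technique, not HRDAG's production methodology):

```python
# Two-list capture-recapture ("multiple systems estimation" in its
# simplest form). The victim identifiers below are invented.

def lincoln_petersen(list_a, list_b):
    """Estimate the total population size from two overlapping lists."""
    a, b = set(list_a), set(list_b)
    overlap = len(a & b)
    if overlap == 0:
        raise ValueError("lists must overlap to estimate the total")
    return len(a) * len(b) / overlap

# Two hypothetical documentation efforts recorded these victims:
ngo_list = ["v01", "v02", "v03", "v04", "v05", "v06"]
press_list = ["v04", "v05", "v06", "v07", "v08"]

# 6 * 5 / 3 = 10: an estimate of the true total, including victims
# that neither list documented.
print(lincoln_petersen(ngo_list, press_list))  # → 10.0
```

In practice HRDAG uses more than two lists and models dependence between them, but the core idea is the same: the overlap pattern carries information about what is hidden.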
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”
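Isaac's point—that the model predicts where the next *recorded* observations will occur, not where crime will occur—can be seen in a toy feedback loop. This is my own minimal simulation with invented district names, not the model analyzed in the article:

```python
# A minimal sketch of the feedback loop: patrols follow recorded
# incidents, but crime is only recorded where police are watching.
import random

random.seed(1)

TRUE_RATE = {"district_a": 0.5, "district_b": 0.5}  # actual crime is equal
recorded = {"district_a": 1, "district_b": 0}  # one early arrest in A

for day in range(1000):
    # "Predictive" deployment: patrol the district with more records.
    patrolled = max(recorded, key=recorded.get)
    # An incident is recorded only in the patrolled district.
    if random.random() < TRUE_RATE[patrolled]:
        recorded[patrolled] += 1

# district_a accumulates essentially all the records, even though the
# underlying crime rates are identical.
print(recorded)
```

A single early arrest sends every subsequent patrol to district_a, so district_b never generates another record—the data confirm the deployment rather than the crime.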
Amnesty report damns Syrian government on prison abuse
An excerpt: The “It breaks the human” report released by the human rights group Amnesty International highlights new statistics from the Human Rights Data Analysis Group, or HRDAG, an organization that uses scientific approaches to analyze human rights violations.
Using Data to Reveal Human Rights Abuses
Profile touching on HRDAG’s work on the trial and conviction of Hissène Habré, its US Policing Project, data integrity, data archaeology and more.
Weapons of Math Destruction
From “Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives.” An excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
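The anecdote above can be made concrete with a few lines of ordinary least squares. The numbers here are invented for illustration (this is not Patrick's example code): the model is fit only to third graders, with the heavier, self-conscious kids absent from the data, so its answer for an adult stays near four feet.

```python
# Fit height-from-weight on third graders only, then ask about an adult.

def ols_fit(pairs):
    """Ordinary least squares: return (slope, intercept) for h ~ w."""
    n = len(pairs)
    mw = sum(w for w, _ in pairs) / n
    mh = sum(h for _, h in pairs) / n
    num = sum((w - mw) * (h - mh) for w, h in pairs)
    den = sum((w - mw) ** 2 for w, _ in pairs)
    slope = num / den
    return slope, mh - slope * mw

# Training set: third graders' (weight in lb, height in inches).
# Heavier kids who skipped the exercise are simply missing, so height
# hovers around 48 inches regardless of weight.
third_graders = [(50, 48.5), (52, 48.0), (55, 47.5),
                 (58, 48.0), (60, 48.0), (65, 48.0)]

slope, intercept = ols_fit(third_graders)

# The "prediction" for a 160 lb adult comes out near four feet tall:
# the algorithm is fine; the training data never saw an adult.
print(round(slope * 160 + intercept, 1))  # → 46.3
```

The fit itself is perfectly correct for the data it was given—which is exactly the problem.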
“El reto de la estadística es encontrar lo escondido” (“The challenge of statistics is to find what is hidden”): a data expert on the conflict
In this interview with Colombian newspaper El Espectador, Patrick Ball is quoted as saying “la gente que no conoce de álgebra nunca debería hacer estadísticas” (people who don’t know algebra should never do statistics).
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”