Quantifying Injustice
“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
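The passage above only names the idea, so here is a minimal sketch of what a synthetic population might look like in practice: combining demographic counts with survey-based rates of a behaviour to estimate where events actually occur, independent of police records. The tract names, column names, and rates below are hypothetical placeholders for illustration, not Lum and Isaac's actual data or code.

```python
import pandas as pd

# Hypothetical census table: one row per tract, population counts by group.
census = pd.DataFrame({
    "tract": ["A", "B", "C"],
    "group_1_pop": [4000, 1500, 2500],
    "group_2_pop": [1000, 3500, 2000],
})

# Hypothetical survey-derived rates of the behaviour of interest
# (e.g. past-month drug use) for each demographic group.
survey_rates = {"group_1_pop": 0.12, "group_2_pop": 0.11}

# Expected events per tract: sum over groups of (group population) * (survey rate).
census["expected_events"] = sum(
    census[col] * rate for col, rate in survey_rates.items()
)

# This survey-based estimate can then be compared with where a model trained
# on arrest records (such as PredPol) concentrates its predictions.
print(census[["tract", "expected_events"]])
```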
What we’ll need to find the true COVID-19 death toll
From the article: “Intentionally inconsistent tracking can also influence the final tally, notes Megan Price, a statistician at the Human Rights Data Analysis Group. During the Iraq War, for example, officials worked to conceal mortality or to cherry pick existing data to steer the political narrative. While wars are handled differently from pandemics, Price thinks the COVID-19 data could still be at risk of this kind of manipulation.”
Unbiased algorithms can still be problematic
“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with: is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”
HRDAG Director of Research Patrick Ball, agreeing with Lum, argued that it is more useful to frame the problem not as bias at the individual level but as bias at the institutional or structural level. If a police department, for example, is convinced it needs to police one neighborhood more heavily than another, he said, then whether any individual officer is racist becomes less relevant.
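To make the point in Lum's and Ball's remarks concrete, here is a toy simulation: even a predictor that is perfectly accurate about arrests reflects where enforcement happens, not where offending happens. Every number below is an illustrative assumption, not an empirical estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
n_per_neighborhood = 100_000

# Assume both neighborhoods have the same underlying offense rate...
offense_rate = 0.05
# ...but neighborhood A is policed much more heavily than B, so an offense
# there is far more likely to end up as an arrest record.
arrest_prob = {"A": 0.60, "B": 0.10}

for hood in ("A", "B"):
    offenses = rng.random(n_per_neighborhood) < offense_rate
    arrests = offenses & (rng.random(n_per_neighborhood) < arrest_prob[hood])
    print(
        f"neighborhood {hood}: "
        f"true offense rate={offenses.mean():.3f}, "
        f"rate recorded in arrest data={arrests.mean():.3f}"
    )

# A model trained and evaluated only on the arrest column would "correctly"
# learn that A produces far more arrests than B, even though the underlying
# offense rates are identical: the bias lives in the data-generating process,
# not in the model's accuracy.
```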
USA
Benetech’s Human Rights Data Analysis Group Publishes 2010 Analysis of Human Rights Violations in Five Countries
Analysis of Uncovered Government Data from Guatemala and Chad Clarifies History and Supports Criminal Prosecutions
By Ann Harrison
The past year of research by the Benetech Human Rights Data Analysis Group (HRDAG) has supported criminal prosecutions and uncovered the truth about political violence in Guatemala, Iran, Colombia, Chad and Liberia. On today’s celebration of the 62nd anniversary of the Universal Declaration of Human Rights, HRDAG invites the international community to engage with scientifically defensible methodologies that illuminate all human rights violations – including those that cannot be directly observed. 2011 will mark the 20th year that HRDAG researchers have analyzed the patterns and magnitude of human rights violations in political conflicts to determine how many of the killed and disappeared have never been accounted for – and who is most responsible.