When It Comes to Human Rights, There Are No Online Security Shortcuts
Patrick Ball. When It Comes to Human Rights, There Are No Online Security Shortcuts, Wired op-ed, August 10, 2012. Wired.com © 2013 Condé Nast. All rights reserved.
Foundation of Human Rights Statistics in Sierra Leone
Richard Conibere (2004). Foundation of Human Rights Statistics in Sierra Leone (abstr.), Joint Statistical Meetings. Toronto, Canada.
Indirect Sampling to Measure Conflict Violence: Trade-offs in the Pursuit of Data That Are Good, Cheap, and Fast
Romesh Silva and Megan Price. “Indirect Sampling to Measure Conflict Violence: Trade-offs in the Pursuit of Data That Are Good, Cheap, and Fast.” Journal of the American Medical Association. 306(5):547-548. 2011. © 2011 JAMA. All rights reserved.
Violence in Blue
Patrick Ball. Violence in Blue. Granta 134, 4 March 2016. © Granta Publications. All rights reserved.
Welcome!
Liberian Truth and Reconciliation Commission Data
‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley
Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.
“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
Using statistics to estimate the true scope of the secret killings at the end of the Sri Lankan civil war
In the last three days of the Sri Lankan civil war, as thousands of people surrendered to government authorities, hundreds of people were put on buses driven by Army officers. Many were never seen again.
In a report released today, the International Truth and Justice Project for Sri Lanka and the Human Rights Data Analysis Group showed that more than 500 people were disappeared in just three days: 17, 18, and 19 May.
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”