Indirect Sampling to Measure Conflict Violence: Trade-offs in the Pursuit of Data That Are Good, Cheap, and Fast
Romesh Silva and Megan Price. “Indirect Sampling to Measure Conflict Violence: Trade-offs in the Pursuit of Data That Are Good, Cheap, and Fast.” Journal of the American Medical Association. 306(5):547-548. 2011. © 2011 JAMA. All rights reserved.
DATNAV: New Guide to Navigate and Integrate Digital Data in Human Rights Research
DatNav is the result of a collaboration between Amnesty International, Benetech, and The Engine Room that began in late 2015 and culminated in an intense four-day writing sprint facilitated by Chris Michael and Collaborations for Change in May 2016. HRDAG consultant Jule Krüger is a contributor, and HRDAG director of research Patrick Ball is a reviewer.
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
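The point that predictive policing learns from police records rather than from crime itself can be illustrated with a deterministic toy model (all numbers below are hypothetical, not from HRDAG's analysis): two districts with identical true crime rates, where patrols follow past records and incidents are only recorded where police are present.

```python
# Hypothetical sketch of the feedback loop: equal crime, unequal records.
rate = 0.5                 # same underlying crime rate in both districts
records = [10.0, 5.0]      # district 0 starts with more historical records

for day in range(1000):
    total = records[0] + records[1]
    # expected new records per district: crime rate times patrol share,
    # because crimes only enter the data where patrols are looking
    records = [r + rate * r / total for r in records]

print(records[0] / records[1])  # still about 2.0: the initial record gap
                                # persists despite identical crime rates
```

The update multiplies both districts' records by the same factor each day, so the initial disparity in the data is reproduced forever — the pattern in the records, not the pattern in crime.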
Using statistics to estimate the true scope of the secret killings at the end of the Sri Lankan civil war
In the last three days of the Sri Lankan civil war, as thousands of people surrendered to government authorities, hundreds of people were put on buses driven by Army officers. Many were never seen again.
In a report released today (see here), the International Truth and Justice Project for Sri Lanka and the Human Rights Data Analysis Group showed that more than 500 people were disappeared over just three days: 17, 18, and 19 May.
The Data Scientist Helping to Create Ethical Robots
Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.
What’s the relationship between statistics and AI and machine learning?
AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.
A better statistical estimation of known Syrian war victims
Researchers from Rice University and Duke University are using the tools of statistics and data science in collaboration with Human Rights Data Analysis Group (HRDAG) to accurately and efficiently estimate the number of identified victims killed in the Syrian civil war.
…
Using records from four databases of people killed in the Syrian war, Chen, Duke statistician and machine learning expert Rebecca Steorts and Rice computer scientist Anshumali Shrivastava estimated there were 191,874 unique individuals documented from March 2011 to April 2014. That’s very close to the estimate of 191,369 compiled in 2014 by HRDAG, a nonprofit that helps build scientifically defensible, evidence-based arguments about human rights violations.
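The core task — counting unique individuals across overlapping lists — can be sketched in miniature. This is NOT the researchers' method (which relies on locality-sensitive hashing and Bayesian entity resolution to handle typos, transliteration, and missing fields); it only shows the simplest form of the deduplication problem, with invented names and dates.

```python
# Toy deduplication across overlapping casualty lists (illustrative only).
def normalize(record):
    """Reduce a record to a comparison key: name, date, place."""
    name, date, place = record
    return (name.strip().lower(), date, place.strip().lower())

def count_unique(*lists):
    """Count distinct individuals across several lists of records."""
    seen = set()
    for records in lists:
        seen.update(normalize(r) for r in records)
    return len(seen)

list_a = [("Ahmad Ali", "2012-03-01", "Homs"), ("Sara K.", "2013-05-02", "Aleppo")]
list_b = [("ahmad ali ", "2012-03-01", "homs"), ("Omar N.", "2011-08-10", "Daraa")]

print(count_unique(list_a, list_b))  # 3: the duplicate Ahmad Ali collapses
```

Real records rarely match this cleanly, which is why the full problem requires the probabilistic machinery the article describes.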
Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice
“I was always a math nerd. My mother has a Polaroid of me in the fourth grade with my science fair project … . It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.
I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”
What HBR Gets Wrong About Algorithms and Bias
“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared firsthand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
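Ball's example can be run numerically. All data below is simulated for illustration: a height-from-weight regression fit only on third graders, whose predictions inevitably land near the training population.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training set: only third graders, weights ~50-70 lbs,
# heights around four feet (roughly 45-51 inches).
weights = rng.uniform(50, 70, size=200)
heights = 30 + 0.3 * weights + rng.normal(0, 1, size=200)

# Fit height-from-weight by least squares.
slope, intercept = np.polyfit(weights, heights, 1)

# Every prediction the model makes stays near its skewed training data:
preds = slope * weights + intercept
print(preds.mean())  # roughly 48 inches: "most people are about four feet tall"
```

The regression itself is fine; the sample contains no adults, so no correction short of better data will move its predictions away from four feet.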