Are journalists lowballing the number of Iraqi war dead?
The Columbia Journalism Review investigates the casualty count in Iraq, more than a decade after the U.S. invasion. HRDAG executive director Patrick Ball is quoted. “IBC is very good at covering the bombs that go off in markets,” said Patrick Ball, an analyst at the Human Rights Data Analysis Group whose career, he says, has been spent studying “people being killed.” But quiet assassinations and military skirmishes away from the capital often receive little or no media attention.
Estimating the human toll in Syria
Megan Price (2017). Estimating the human toll in Syria. Nature Human Behaviour. 8 February 2017. ISSN 2397-3374. © 2017 Macmillan Publishers Limited, part of Springer Nature. All rights reserved.
“El reto de la estadística es encontrar lo escondido” (“The challenge of statistics is to find what is hidden”): an expert on handling data about the conflict
In this interview with Colombian newspaper El Espectador, Patrick Ball is quoted as saying “la gente que no conoce de álgebra nunca debería hacer estadísticas” (people who don’t know algebra should never do statistics).
Data Collection and Documentation for Truth-Seeking and Accountability
Megan Price and Patrick Ball (2014). The Syrian Justice and Accountability Centre. © 2014 SJAC. Creative Commons BY-NC-SA.
How Structuring Data Unburies Critical Louisiana Police Misconduct Data
Trump’s “extreme-vetting” software will discriminate against immigrants “Under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
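Lum's point is the base-rate problem: when the behavior being predicted is extremely rare, even a highly accurate classifier flags far more innocent people than true positives. A minimal sketch of that arithmetic, using hypothetical numbers chosen only for illustration (the prevalence, sensitivity, and error rate below are assumptions, not figures from the article):

```python
# Base-rate illustration with hypothetical numbers (not from the article).
population = 300_000_000        # people screened
prevalence = 1e-5               # assume 1 in 100,000 would ever offend
sensitivity = 0.99              # assume the tool flags 99% of true positives
false_positive_rate = 0.01      # assume it wrongly flags 1% of everyone else

true_positives = population * prevalence * sensitivity
false_positives = population * (1 - prevalence) * false_positive_rate

# Precision: of all people flagged, what fraction are true positives?
precision = true_positives / (true_positives + false_positives)

print(f"flagged correctly:   {true_positives:,.0f}")
print(f"flagged incorrectly: {false_positives:,.0f}")
print(f"precision:           {precision:.4f}")
```

Even with these generous assumptions about accuracy, roughly a thousand people are wrongly flagged for every person correctly flagged, which is the "huge number of false positives" Lum describes.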
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice
“I was always a math nerd. My mother has a Polaroid of me in the fourth grade with my science fair project… It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.
I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”
The Data Scientist Helping to Create Ethical Robots
Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.
What’s the relationship between statistics and AI and machine learning?
AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crime or where it will occur. She has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained on that data will reinforce and, in some cases, amplify this bias. She also works on statistical issues related to the criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of the FAT conference, which brings together researchers and practitioners interested in these issues in socio-technical systems.
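The feedback loop Lum describes can be sketched in a toy simulation. This is not her actual model, only a hypothetical illustration under simplified assumptions: two districts have identical true crime rates, but one starts with more recorded incidents because of historically disparate enforcement; a naive "model" sends patrols to whichever district has the most recorded incidents, and patrols record incidents only where they are deployed.

```python
def simulate(true_rates, initial_counts, steps=50, patrols_per_step=100):
    """Toy feedback loop: patrols go to the district with the most
    recorded incidents, and new incidents are recorded only where
    patrols are present, at that district's true rate."""
    counts = list(initial_counts)
    for _ in range(steps):
        # The "model": predict the hotspot from historical records.
        hot = max(range(len(counts)), key=lambda i: counts[i])
        # Patrols record incidents only in the district they visit.
        counts[hot] += patrols_per_step * true_rates[hot]
    total = sum(counts)
    return [c / total for c in counts]  # final share of recorded incidents

# Equal true rates; district 0 starts with a historical recording disparity.
shares = simulate(true_rates=[0.5, 0.5], initial_counts=[60, 40])
print(shares)
```

Despite identical underlying rates, district 0's share of recorded incidents grows from 60% toward nearly 100%: the historical disparity in the data is not merely preserved but amplified, which is the dynamic Lum's work demonstrates.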