Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
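Lum’s point is essentially the base-rate problem: when the behavior being predicted is extremely rare, even an accurate screening tool produces flags that are overwhelmingly false positives. The sketch below works through one set of assumed, illustrative numbers; the population size, prevalence, and error rates are not from the article or from HRDAG.

```python
# Back-of-the-envelope sketch of the base-rate problem behind Lum's warning.
# Every number below is an assumption chosen for illustration only.

population = 300_000_000        # assumed number of people screened
prevalence = 1e-5               # assumed share who would actually go on to offend
sensitivity = 0.99              # assumed: tool flags 99% of true future offenders
false_positive_rate = 0.01      # assumed: tool wrongly flags 1% of everyone else

true_cases = population * prevalence
flagged_true = true_cases * sensitivity
flagged_false = (population - true_cases) * false_positive_rate

print(f"correctly flagged:       {flagged_true:,.0f}")
print(f"falsely flagged:         {flagged_false:,.0f}")
print(f"share of flags in error: {flagged_false / (flagged_true + flagged_false):.1%}")
```

Under these assumptions, roughly three million people are flagged in error for every few thousand flagged correctly, which is the asymmetry Lum describes.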
Inside the Difficult, Dangerous Work of Tallying the ISIS Death Toll
HRDAG executive director Megan Price is interviewed by Mother Jones. An excerpt: “Violence can be hidden,” says Price. “ISIS has its own agenda. Sometimes that agenda is served by making public things they’ve done, and I have to assume, sometimes it’s served by hiding things they’ve done.”
How statistics lifts the fog of war in Syria
Megan Price, director of research, is quoted from her Strata talk on how to handle multiple data sources in conflicts such as the one in Syria. From the blog post:
“The true number of casualties in conflicts like the Syrian war seems unknowable, but the mission of the Human Rights Data Analysis Group (HRDAG) is to make sense of such information, clouded as it is by the fog of war. They do this not by nominating one source of information as the ‘best’, but instead with statistical modeling of the differences between sources.”
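For a sense of what modeling the differences between sources can mean in practice, here is a minimal sketch of two-list capture-recapture (the Lincoln-Petersen estimator), the simplest version of the idea that the overlap between sources carries information about what none of them recorded. The counts are invented, and HRDAG’s casualty work uses multiple systems estimation with more lists and far more careful assumptions.

```python
# Toy two-list capture-recapture (Lincoln-Petersen) estimate. The counts are
# invented; real multiple systems estimation uses more lists and models the
# dependence between sources rather than assuming independence as done here.

list_a = 900      # victims documented by source A (hypothetical)
list_b = 1_200    # victims documented by source B (hypothetical)
both = 300        # victims appearing on both lists (hypothetical)

documented = list_a + list_b - both
# If the two sources documented victims independently, the fraction of A's
# victims that also appear in B estimates B's overall coverage, which lets us
# project the total, including victims neither source recorded.
estimated_total = list_a * list_b / both

print(f"documented by at least one source: {documented:,}")
print(f"estimated total, incl. unrecorded: {estimated_total:,.0f}")
```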
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crimes or where they will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce, and in some cases amplify, that bias. She also works on statistical issues in the criminal “risk assessment” models used to inform judicial decision-making; as part of this work, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of the FAT conference, which brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
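A toy simulation makes the feedback loop concrete. The sketch below is not Lum’s analysis and all numbers are invented; it only illustrates how a predictor trained on enforcement records, and retrained on the records it helps generate, can lock in and widen a historical disparity even when the underlying rates are identical.

```python
import random

# Toy simulation of the feedback loop described above (not Lum's model or data).
# Two areas have the same true incident rate, but the historical record is
# assumed to be skewed toward area A by past enforcement. A naive "predictive"
# policy that always patrols the area with the most recorded incidents then
# generates new records only there, so the recorded disparity persists and
# widens even though the underlying rates are equal.

random.seed(0)

TRUE_RATE = 0.05                      # identical underlying rate in both areas
ENCOUNTERS_PER_PATROL = 400           # chances per patrol to record an incident
records = {"A": 120, "B": 100}        # assumed: history already skewed toward A

for day in range(100):
    target = max(records, key=records.get)           # patrol where the data point
    new_records = sum(random.random() < TRUE_RATE
                      for _ in range(ENCOUNTERS_PER_PATROL))
    records[target] += new_records                    # ...and feed the data back in

print(records)   # area A's count keeps growing; area B's never changes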