Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
5 Humanitarian FOSS Projects to Watch
Dave Neary described “5 Humanitarian FOSS Projects to Watch,” listing HRDAG’s work on police homicides in the U.S. and on human rights abuses in other countries.
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
Sobre fosas clandestinas, tenemos más información que el gobierno: Ibero (On clandestine graves, we have more information than the government: Ibero)
The model “can distinguish between the municipalities where we are going to find clandestine graves, and those where it is unlikely that we will find them,” explained Patrick Ball, a U.S. statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.
Crean sistema para predecir fosas clandestinas en México (A system is created to predict clandestine graves in Mexico)
To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are conducting a statistical analysis built on a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
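Lum’s concern is the classic base-rate problem: when the outcome being predicted is extremely rare, even a very accurate classifier flags far more innocent people than actual cases. A minimal sketch, using entirely hypothetical numbers (population size, prevalence, and accuracy are illustrative assumptions, not figures from the letter):

```python
# Base-rate sketch with hypothetical numbers: screening 10 million people
# for an outcome with prevalence 1 in 100,000, using a classifier that is
# 99% sensitive and 99% specific.
population = 10_000_000
prevalence = 1 / 100_000
sensitivity = 0.99   # true positive rate
specificity = 0.99   # true negative rate

actual_positives = population * prevalence              # 100 people
actual_negatives = population - actual_positives

true_positives = actual_positives * sensitivity         # ~99 flagged correctly
false_positives = actual_negatives * (1 - specificity)  # ~100,000 flagged in error

precision = true_positives / (true_positives + false_positives)
print(f"flagged in error: {false_positives:,.0f}")
print(f"precision: {precision:.4f}")
```

Under these assumptions, fewer than one in a thousand flags points at a real case, which is the “huge number of people who would never go on to be terrorists” Lum describes.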
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. In her work, she has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
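The amplification Lum describes can be seen in a toy feedback loop: patrols are sent where records are densest, and patrols generate new records. The districts, counts, and allocation rule below are invented for illustration and are not Lum’s model:

```python
# Hypothetical sketch of the feedback loop: two districts with identical
# underlying crime, but district A starts with more recorded incidents
# because of historically heavier enforcement.
records = {"A": 60, "B": 40}       # historical enforcement disparity
true_rate = {"A": 1.0, "B": 1.0}   # identical underlying crime

for day in range(50):
    # "predictive" allocation: patrol the district with the most records
    target = max(records, key=records.get)
    # patrols produce new recorded incidents at the true rate
    records[target] += true_rate[target]

share_A = records["A"] / (records["A"] + records["B"])
print(f"district A's share of records: {share_A:.2f}")
```

Although the two districts have the same true crime rate, district A’s share of the records grows from 60% toward 73%: the software learns the enforcement pattern, not the crime pattern.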
The Data Scientist Helping to Create Ethical Robots
Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.
What’s the relationship between statistics and AI and machine learning?
AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.
Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice
“I was always a math nerd. My mother has a Polaroid of me in the fourth grade with my science fair project … . It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.
I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”
That Higher Count Of Police Killings May Still Be 25 Percent Too Low
Carl Bialik of 538 Politics reports on a new HRDAG study by Kristian Lum and Patrick Ball regarding the Bureau of Justice Statistics report, issued a few weeks earlier, on the annual number of police killings. As Bialik writes, the HRDAG scientists extrapolated from their work in five other countries (Colombia, Guatemala, Kosovo, Sierra Leone and Syria) to estimate that the BJS study missed approximately one quarter of the total number of killings by police.
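The kind of extrapolation HRDAG uses in such studies rests on multiple-systems (capture-recapture) estimation: comparing the overlap between independent lists of victims to estimate how many cases every list missed. A minimal two-list sketch using the Lincoln–Petersen estimator, with made-up counts (HRDAG’s actual estimates use more lists and more careful models):

```python
# Two-list Lincoln-Petersen sketch with invented counts.
list_a = 900     # killings documented by source A
list_b = 700     # killings documented by source B
in_both = 500    # cases appearing on both lists

# Lincoln-Petersen estimate of the total, including undocumented cases
estimated_total = list_a * list_b / in_both

documented = list_a + list_b - in_both   # unique documented cases
undercount = 1 - documented / estimated_total
print(f"estimated total: {estimated_total:.0f}")
print(f"share missed by both lists: {undercount:.0%}")
```

With these invented numbers, the two lists together document 1,100 cases but the overlap implies roughly 1,260 in total, so about 13% of cases appear on neither list.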
Can ‘predictive policing’ prevent crime before it happens?
HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”
Kriege und Social Media: Die Daten sind nicht perfekt (Wars and Social Media: The Data Isn’t Perfect)
Süddeutsche Zeitung writer Mirjam Hauck interviewed HRDAG affiliate Anita Gohdes about the pitfalls of relying on social media data when interpreting violence in the context of war. This article, “Kriege und Social Media: Die Daten sind nicht perfekt,” is in German.