Our Story
Seeking the Truth with Documentation
Locating Hidden Graves in Mexico
HRDAG Names New Board Members Julie Broome and Frank Schulenburg
Welcoming Our New Statistician
Welcoming Our New Technical Lead
Casanare, Colombia
Frequently Asked Questions
How Many Peruvians Have Died?
Patrick Ball, Jana Asher, David Sulmont, and Daniel Manrique. “How Many Peruvians Have Died?” © 2003 American Association for the Advancement of Science.
Kriege und Social Media: Die Daten sind nicht perfekt
Süddeutsche Zeitung writer Mirjam Hauck interviewed HRDAG affiliate Anita Gohdes about the pitfalls of relying on social media data when interpreting violence in the context of war. This article, “Kriege und Social Media: Die Daten sind nicht perfekt” (“Wars and Social Media: The Data Are Not Perfect”), is in German.
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
The Data Scientist Helping to Create Ethical Robots
Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.
What’s the relationship between statistics and AI and machine learning?
AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models that predict who will commit future crime or where it will occur. She has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained on this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this work, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
New Estimate Of Killings By Police Is Way Higher — And Still Too Low
Carl Bialik of 538 Politics interviews HRDAG executive director Patrick Ball in an article about the recently released Bureau of Justice Statistics report about the number of annual police killings, both reported and unreported. As Bialik writes, this is a math puzzle with real consequences.
Crean sistema para predecir fosas clandestinas en México
To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are conducting a statistical analysis built from a variable that identifies clandestine graves via automated searches of local and national media, combined with geographic and sociodemographic data. This article is in Spanish.
Improving the estimate of U.S. police killings
Cory Doctorow of Boing Boing writes about HRDAG executive director Patrick Ball and his contribution to Carl Bialik’s article about the recently released Bureau of Justice Statistics report on the number of annual police killings, both reported and unreported, in 538 Politics.
Sous la dictature d’Hissène Habré, le ridicule tuait
Patrick Ball, a statistics expert retained by the Extraordinary African Chambers, concluded that “mortality in the DDS prisons was substantially higher than in the worst twentieth-century contexts for prisoners of war.” This article is in French.