

Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Documenting Syrian Deaths with Data Science

Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. In her work, she has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.


On clandestine graves, we have more information than the government: Ibero

The model “can distinguish between the municipalities where we are going to find clandestine graves, and those where it is unlikely that we will find them,” explained Patrick Ball, an American statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.


Clandestine graves in Mexico point to the existence of crimes against humanity

Patrick Ball, an American statistician, is collaborating with the Human Rights Program of the Universidad Iberoamericana on an investigation into clandestine graves.


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
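The distinction Ball draws—that a classifier can err in more than one way, and someone must decide which error is more acceptable—can be illustrated with a toy count of false positives versus false negatives. All numbers below are invented for illustration:

```python
# Toy illustration of the two error types a classifier can make:
# flagging cases that didn't happen (false positives) versus
# missing cases that did (false negatives). Data is invented.
actual    = [1, 1, 0, 0, 1, 0, 0, 1]  # 1 = event truly occurred
predicted = [1, 0, 0, 1, 1, 0, 0, 0]  # classifier's guesses

false_positives = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
false_negatives = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)

# Which of these errors matters more is a policy choice, not a
# statistical one -- the question Ball is posing.
print(false_positives, false_negatives)  # → 1 2
```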


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


Machine learning is being used to uncover the mass graves of Mexico’s missing

“Patrick Ball, HRDAG’s Director of Research and the statistician behind the code, explained that the Random Forest classifier, trained on data from 2013, was able to predict with 100% accuracy which counties would go on to have mass graves found in them in 2014. The model also identified counties where no hidden mass graves had yet been found but which show a high likelihood of containing them. This predictive aspect of the model holds the most potential for future research.”


Hunting for Mexico’s mass graves with machine learning

“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
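The workflow described in these two pieces—fit a Random Forest on one year's county-level features, then score the following year's counties—can be sketched roughly as follows. Every feature name, value, and label rule here is invented for illustration; this is not the actual HRDAG model or its data:

```python
# Hypothetical sketch of a county-level Random Forest classifier,
# loosely mirroring the 2013 -> 2014 validation described above.
# All features and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_counties = 200

# Toy county features: drug-lab bust (0/1), borders the US (0/1),
# coastal (0/1), fraction mountainous, highway count, mean school score.
X_2013 = np.column_stack([
    rng.integers(0, 2, n_counties),
    rng.integers(0, 2, n_counties),
    rng.integers(0, 2, n_counties),
    rng.random(n_counties),
    rng.integers(0, 10, n_counties),
    rng.random(n_counties),
]).astype(float)

# Toy label: whether a clandestine grave was found in the county.
y_2013 = (X_2013[:, 0] + X_2013[:, 3] > 1.2).astype(int)

# Fit on one year's data, then score a slightly shifted copy standing
# in for the next year's features.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_2013, y_2013)

X_2014 = X_2013 + rng.normal(0, 0.05, X_2013.shape)
grave_prob = model.predict_proba(X_2014)[:, 1]  # per-county probability
```

The per-county probabilities, rather than hard yes/no labels, are what make the "high likelihood but not yet found" counties visible—scores near 1 for counties with no recorded graves flag where to look next.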


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


What happens when you look at crime by the numbers

Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”


The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


Investigating Boston Police Department SWAT Raids from 2012 to 2020

HRDAG collaborated with the Data for Justice Project on a tool allowing members of the public to visualize and analyze nearly a decade of Boston Police Department SWAT team after-action reports. Tarak Shah of HRDAG is named in the acknowledgments.


Using Data to Reveal Human Rights Abuses

Profile touching on HRDAG’s work on the trial and conviction of Hissène Habré, its US Policing Project, data integrity, data archaeology and more.


Amnesty International Reports Organized Murder Of Detainees In Syrian Prison

Reports of torture and disappearances in Syria are not new. But the Amnesty International report says the magnitude and severity of abuse has “increased drastically” since 2011. Citing the Human Rights Data Analysis Group, the report says “at least 17,723 people were killed in government custody between March 2011 and December 2015, an average of 300 deaths each month.”


Civil War in Syria: The Internet as a Weapon of War

Süddeutsche Zeitung writer Hakan Tanriverdi interviews HRDAG affiliate Anita Gohdes and writes about her work on the Syrian casualty enumeration project for the UN Office of the High Commissioner for Human Rights. This article, “Bürgerkrieg in Syrien: Das Internet als Kriegswaffe,” is in German.


Syria’s status, the migrant crisis and talking to ISIS

In this week’s “Top Picks,” IRIN interviews HRDAG executive director Patrick Ball about giant data sets and whether we can trust them. “No matter how big it is, data on violence is always partial,” he says.


Wars and Social Media: The Data Are Not Perfect

Süddeutsche Zeitung writer Mirjam Hauck interviewed HRDAG affiliate Anita Gohdes about the pitfalls of relying on social media data when interpreting violence in the context of war. This article, “Kriege und Social Media: Die Daten sind nicht perfekt,” is in German.


5 Humanitarian FOSS Projects to Watch

Dave Neary described “5 Humanitarian FOSS Projects to Watch,” listing HRDAG’s work on police homicides in the U.S. and on human rights abuses in other countries.


That Higher Count Of Police Killings May Still Be 25 Percent Too Low

Carl Bialik of FiveThirtyEight reports on a new HRDAG study, authored by Kristian Lum and Patrick Ball, examining the recently issued Bureau of Justice Statistics report on the number of annual police killings. As Bialik writes, the HRDAG scientists extrapolated from their work in five other countries (Colombia, Guatemala, Kosovo, Sierra Leone and Syria) to estimate that the BJS study missed approximately one quarter of the total number of killings by police.


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.

Donate