Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Sierra Leone TRC Data and Statistical Appendix
Data Mining for Good: CJA Drink + Think
100 Women in AI Ethics
We live in very challenging times. The pervasiveness of bias in AI algorithms and the prospect of autonomous “killer” robots on the horizon necessitate an open discussion and immediate action to address the perils of unchecked AI. The decisions we make today will determine the fate of future generations. Please follow these amazing women and support their work so we can make faster, more meaningful progress toward a world with safe, beneficial AI that will help, not hurt, the future of humanity.
53. Kristian Lum @kldivergence
The Limits of Observation for Understanding Mass Violence.
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
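The feedback loop Lum describes can be sketched in a few lines. This is a toy illustration, not any actual predictive-policing system: two districts have identical true crime rates, but district A starts with more recorded crime because it was historically over-policed. Patrols are then allocated in proportion to recorded crime, and patrols generate new recorded crime.

```python
# Toy model of the over-policing feedback loop (illustrative only):
# both districts have the same true crime rate, but district A
# begins with a surplus of *recorded* crime.
true_rate = [0.5, 0.5]    # identical underlying crime rates
recorded = [60.0, 40.0]   # district A historically over-policed

for step in range(20):
    total = recorded[0] + recorded[1]
    # "Predictive" step: patrols allocated by share of recorded crime.
    patrols = [r / total for r in recorded]
    # More patrols -> more crimes observed and recorded,
    # even though the true rates are equal.
    recorded = [r + p * t * 100
                for r, p, t in zip(recorded, patrols, true_rate)]

share_a = recorded[0] / sum(recorded)
print(round(share_a, 3))  # district A's share of recorded crime
```

Even after many iterations, district A's share of recorded crime never falls back toward its true 50% share: the initial disparity is passed straight through the algorithm, which is exactly the dynamic Lum describes.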
Partners
Data-driven development needs both social and computer scientists
Excerpt:
Data scientists are programmers who ignore probability but like pretty graphs, said Patrick Ball, a statistician and human rights advocate who cofounded the Human Rights Data Analysis Group.
“Data is broken,” Ball said. “Anyone who thinks they’re going to use big data to solve a problem is already on the path to fantasy land.”
HRDAG’s Year in Review: 2021
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
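A model combining county-level predictors like these might take a simple logistic form. The feature names below come from the article, but the model structure and every coefficient are made up for illustration; the researchers' actual method and weights are not described here.

```python
import math

# Predictor variables named in the article; weights are entirely
# hypothetical, for demonstration only.
FEATURES = ["drug_lab_busted", "borders_us", "borders_ocean",
            "pct_mountainous", "has_highway", "school_score"]

def predict_grave_risk(county, weights, bias=-2.0):
    """Illustrative logistic score in (0, 1) from county features."""
    z = bias + sum(weights[f] * county[f] for f in FEATURES)
    return 1.0 / (1.0 + math.exp(-z))

# Made-up coefficients: e.g. a busted drug lab raises the score,
# stronger school results lower it.
weights = {"drug_lab_busted": 1.5, "borders_us": 0.8,
           "borders_ocean": 0.6, "pct_mountainous": 1.0,
           "has_highway": 0.5, "school_score": -0.7}

county = {"drug_lab_busted": 1, "borders_us": 1, "borders_ocean": 0,
          "pct_mountainous": 0.4, "has_highway": 1, "school_score": 0.3}
print(round(predict_grave_risk(county, weights), 3))
```

The point of the sketch is only that heterogeneous county attributes, obvious and non-obvious alike, can be combined into a single probability-like risk score.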
Truth Commissioner
From the Guatemalan military to the South African apartheid police, code cruncher Patrick Ball singles out the perpetrators of political violence.
How Structuring Data Unburies Critical Louisiana Police Misconduct Data
Syria’s status, the migrant crisis and talking to ISIS
In this week’s “Top Picks,” IRIN interviews HRDAG executive director Patrick Ball about giant data sets and whether we can trust them. “No matter how big it is, data on violence is always partial,” he says.
Selection Bias and the Statistical Patterns of Mortality in Conflict.
Megan Price and Patrick Ball. 2015. Statistical Journal of the IAOS 31: 263–272. doi: 10.3233/SJI-150899. © IOS Press and the authors. All rights reserved. Creative Commons BY-NC-SA.