HRDAG’s Year in Review: 2023
Welcoming Our 2019-2020 Visiting Data Science Student
Data Science Symposium at Vanderbilt
Our Thoughts on the Violence in Charlottesville
Data Mining on the Side of the Angels
“Data, by itself, isn’t truth.” How HRDAG uses data analysis and statistical methods to shed light on mass human rights abuses. Executive director Patrick Ball is quoted from his speech at the Chaos Communication Congress in Hamburg, Germany.
Welcoming a New Board Member
Welcoming Our New Administrative Coordinator
HRDAG To Join the Partnership on AI
Tech Note – improving LLM-driven info extraction
Lessons at HRDAG: Holding Public Institutions Accountable
Welcoming Our New HRDAG Data Scientist
HRDAG’s Year in Review: 2020
Quantifying Injustice
“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
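The idea is easy to sketch in code. In the minimal sketch below, survey-based drug-use rates are attached to census-style districts and a population of users is simulated independently of any police records; the district names, counts, and rates are all hypothetical illustrations, not Lum and Isaac's actual inputs.

```python
# A minimal sketch of a synthetic population: attach survey-based
# drug-use rates (hypothetical numbers) to census-style districts, then
# simulate users without reference to any police records.
import random

random.seed(0)

# Hypothetical districts: population counts and an illustrative annual
# drug-use rate as a health survey might report it.
groups = {
    "district_A": {"population": 50_000, "use_rate": 0.12},
    "district_B": {"population": 30_000, "use_rate": 0.13},
    "district_C": {"population": 20_000, "use_rate": 0.11},
}

# Each simulated resident is flagged as a drug user with their
# district's survey rate -- police data never enters the simulation.
synthetic_users = {
    name: sum(random.random() < g["use_rate"] for _ in range(g["population"]))
    for name, g in groups.items()
}

total = sum(synthetic_users.values())
for name, users in synthetic_users.items():
    print(f"{name}: ~{users} estimated users ({users / total:.1%} of total)")
```

Comparing a distribution built this way against the distribution of police-recorded drug arrests gives an outside check on the prediction model that the biased records alone cannot provide.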
How Many People Will Get Covid-19?
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
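That feedback loop is simple enough to simulate. The sketch below assumes two districts with identical true crime rates, one of which starts with more recorded incidents purely because it was historically over-policed; every number in it is made up for illustration.

```python
# A sketch of the feedback loop: patrols follow past records, and
# patrolling generates new records. All numbers are illustrative.
true_incidents = {"north": 100, "south": 100}  # identical actual crime
recorded = {"north": 60, "south": 30}          # biased historical records
DETECTION_RATE = 0.2  # share of true incidents a patrol turns into records

for period in range(1, 6):
    # The "predictive" step: patrol wherever past records are highest.
    target = max(recorded, key=recorded.get)
    # Patrolling there converts true incidents into new records.
    recorded[target] += true_incidents[target] * DETECTION_RATE
    print(f"period {period}: patrol {target}, records {recorded}")
```

The recorded gap between the two districts grows every period even though the underlying crime rates never differ: past policing, not crime, is what the "prediction" learns.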
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
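That example can be made concrete in a few lines. The sketch below uses a nearest-neighbor predictor and invented numbers; the model choice and the data are ours for illustration, not Ball's.

```python
# Training data: (weight in lbs, height in inches) for eight-year-olds,
# truncated at the top because heavier kids skipped the exercise.
# All values are made up for illustration.
train = [(48, 49), (50, 50), (52, 50), (54, 51), (55, 52), (57, 52)]

def predict(weight):
    """Nearest-neighbor: return the height of the closest training weight."""
    return min(train, key=lambda p: abs(p[0] - weight))[1]

# Ask the model about adults it has never seen anything like.
for weight in (130, 160, 190):
    pred = predict(weight)
    print(f"{weight} lbs -> predicted {pred} in ({pred / 12:.1f} ft)")
```

Every adult query lands nearest the heaviest child in the truncated training set, so the model answers "about four feet" no matter whose height it is asked to predict.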