The killings of social movement leaders and human rights defenders in Colombia 2018 – 2023: an estimate of the universe.
HRDAG + Dejusticia
Abstract: In 2018, Dejusticia and HRDAG published our first report estimating the total number of social leaders killed in Colombia during 2016–2017. We demonstrated that a statistical method known as “capture-recapture” could be used to estimate the underreporting of murdered social leaders, and our estimate closely matched the total documented collectively by the organizations. A year later, we released a second report, updating the data to include 2018. Five years later, we revisited this exercise to cover the period from 2019 to 2023, focusing on three of the original six organizations. A sketch of the core estimation idea appears after this entry. Read the article on HRDAG (en español).
Valentina Rozo Ángel and Patrick Ball. 2024. Asesinatos de líderes sociales y defensores de derechos humanos en Colombia: una estimación del universo, actualización 2019–2023. Human Rights Data Analysis Group. 18 December 2024. © HRDAG 2024.
Creative Commons 4.0 International license.
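As a hedged illustration of the capture-recapture idea described in the abstract above: with two overlapping lists of documented victims, the simple Lincoln-Petersen estimator infers the total universe from the size of each list and their overlap. HRDAG's published estimates rely on more sophisticated multi-list (multiple systems estimation) models; this minimal sketch, with invented numbers, shows only the core logic.

```python
# A minimal sketch of two-list capture-recapture (the Lincoln-Petersen
# estimator), with invented numbers. HRDAG's real analyses use
# multi-list, Bayesian multiple systems estimation, not this toy.

def lincoln_petersen(n1: int, n2: int, overlap: int) -> float:
    """Estimate the total number of victims from two overlapping lists.

    n1, n2  -- killings documented by each organization
    overlap -- killings appearing on both lists (matched records)
    """
    if overlap == 0:
        raise ValueError("no overlap between lists: estimator undefined")
    return n1 * n2 / overlap

# Hypothetical example: lists of 120 and 90 documented killings sharing
# 60 matched records imply roughly 180 victims in total, i.e. some
# killings were recorded by neither organization.
print(lincoln_petersen(120, 90, 60))  # 180.0
```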
HRDAG’s Year in Review: 2022
Tech Note: Chicago Missing Persons
Gaza death toll 40% higher than official number, Lancet study finds
“Patrick Ball, a statistician at the US-based Human Rights Data Analysis Group not involved in the research, has used capture-recapture methods to estimate death tolls for conflicts in Guatemala, Kosovo, Peru and Colombia.
“Ball told AFP the well-tested technique had been used for centuries and that the researchers had reached ‘a good estimate’ for Gaza.”
Talks & Discussions
Recognising Uncertainty in Statistics
In Responsible Data Reflection Story #7, from the Responsible Data Forum, work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point that quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”
How much faith can we place in coronavirus antibody tests?
In Solidarity
Scatter and keep working
Perú
Quantifying Injustice
“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
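To make the quoted idea concrete: a synthetic population combines survey-based offense rates with census demographics to estimate where offenses would occur independent of police records, which can then be compared with algorithmic predictions trained on arrest data. The rates, counts, and neighborhood names below are hypothetical, not Lum and Isaac's actual data; this is a minimal sketch of the technique, not their implementation.

```python
# Hedged sketch of a "synthetic population": survey-based rates crossed
# with census counts give a demographics-only baseline of offenses.
# All numbers and names below are invented for illustration.

# Hypothetical annual drug-use rates by age group, from a health survey.
use_rate = {"18-25": 0.20, "26-34": 0.12, "35+": 0.06}

# Hypothetical census counts of residents by neighborhood and age group.
census = {
    "north": {"18-25": 5000, "26-34": 7000, "35+": 20000},
    "south": {"18-25": 3000, "26-34": 4000, "35+": 15000},
}

# Expected offenses per neighborhood implied by demographics alone;
# comparing this baseline with recorded arrests exposes enforcement bias.
for hood, groups in census.items():
    expected = sum(count * use_rate[age] for age, count in groups.items())
    print(hood, round(expected))  # north 3040, south 1980
```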
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crimes or where they will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and, in some cases, amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
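As an illustration of the feedback loop described above, here is a toy simulation with invented parameters (not Lum's published analysis): if patrols are allocated wherever past records are densest, and new records can accumulate only where police are present, an initially small recording disparity grows without any difference in underlying crime.

```python
# Toy feedback-loop simulation, in the spirit of (but much simpler than)
# Lum and Isaac's predictive-policing analysis. All parameters invented.

import random

true_rate = {"A": 0.5, "B": 0.5}   # equal underlying crime rates
records = {"A": 20, "B": 10}       # biased historical arrest counts

rng = random.Random(1)
for day in range(200):
    # "Predictive" allocation: patrol the district with the most records.
    target = max(records, key=records.get)
    # Crime is only recorded where police patrol, so the record gap grows.
    if rng.random() < true_rate[target]:
        records[target] += 1

print(records)  # district A accumulates nearly all new records
```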
The True Dangers of AI are Closer Than We Think
William Isaac is quoted.
Data-driven crime prediction fails to erase human bias
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
