Killings of social movement leaders in Colombia: an estimation of the total population of victims – update 2018
Valentina Rozo Ángel and Patrick Ball (2019). Killings of social movement leaders in Colombia: an estimation of the total population of victims – update 2018. Human Rights Data Analysis Group. 10 December 2019. © HRDAG 2019. [English] [español]
AI for Human Rights
From the article: “Price described the touchstone of her organization as being a tension between how truth is simultaneously discovered and obscured. HRDAG is at the intersection of this tension; they are consistently participating in science’s progressive uncovering of what is true, but they are accustomed to working in spaces where this truth is denied. Among the many responsibilities HRDAG holds in its work is that of ‘speaking truth to power,’ said Price, ‘and if that’s what you’re doing, you have to know that your truth stands up to adversarial environments.’”
Preserving Human Rights Data with the Filecoin Network: A Journey into the Decentralized Web with HRDAG
Patrick Ball (2024). Preserving Human Rights Data with the Filecoin Network: A Journey into the Decentralized Web with HRDAG. Filecoin Foundation for the Decentralized Web. 18 April 2024.
New Report Raises Questions Over CPD’s Approach to Missing Persons Cases
In this video, Trina Reynolds-Tyler of Invisible Institute talks about her work with HRDAG on the missing persons project in Chicago and Beneath the Surface.
PRIO 2023 Shortlist for Nobel Peace Prize
In a CNN interview predicting a Nobel Peace Prize winner, Henrik Urdal from PRIO talks about his shortlist and HRDAG.
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
Drug-Related Killings in the Philippines
Patrick Ball, Sheila Coronel, Mariel Padilla and David Mora (2019). Drug-related killings in the Philippines. Human Rights Data Analysis Group. 26 July 2019. © HRDAG 2019.
Data-Driven Efforts to Address Racial Inequality
From the article: “As we seek to advance the responsible use of data to address racial injustice, we encourage individuals and organizations to support and build upon efforts already underway.” HRDAG is listed in the Data Driven Activism and Advocacy category.
The Allegheny Family Screening Tool’s Overestimation of Utility and Risk
Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Allegheny Family Screening Tool’s Overestimation of Utility and Risk. Logic(s). 13 December 2023. Issue 20.
Collaboration between the Colombian Truth Commission, the Special Jurisdiction for Peace, and HRDAG (Dataset)
The Colombian Truth Commission (CEV), the Special Jurisdiction for Peace (JEP), and the Human Rights Data Analysis Group (HRDAG) have worked together to integrate data and calculate statistical estimates of the number of victims of the armed conflict, including homicides, forced disappearances, kidnapping, and the recruitment of child soldiers. Data are available through the National Administrative Department of Statistics (DANE), the Truth Commission, and GitHub.
Human Rights and the Decentralized Web
Analyzing patterns of violence in Colombia using more than 100 databases
Analizando los patrones de violencia en Colombia con más de 100 bases de datos
Measures of Fairness for New York City’s Supervised Release Risk Assessment Tool
Kristian Lum and Tarak Shah (2019). Measures of Fairness for New York City’s Supervised Release Risk Assessment Tool. Human Rights Data Analysis Group. 1 October 2019. © HRDAG 2019.
The World According to Artificial Intelligence – The Bias in the Machine (Part 2)
Artificial intelligence might be a technological revolution unlike any other, transforming our homes, our work, our lives; but for many – the poor, minority groups, the people deemed to be expendable – their picture remains the same.
Patrick Ball is interviewed: “The question should be, Who bears the cost when a system is wrong?”
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
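The county-level predictors described above can be pictured as inputs to a binary classifier. The sketch below is purely illustrative and is not HRDAG's actual model: the feature names, the synthetic labels, and the choice of a hand-rolled logistic regression are all assumptions made for the example.

```python
import math
import random

# Hypothetical county-level features echoing the article's predictors:
# drug_lab_busted (0/1), borders_us (0/1), borders_ocean (0/1),
# pct_mountainous (0..1), has_highway (0/1), school_score (0..1).
FEATURES = ["drug_lab_busted", "borders_us", "borders_ocean",
            "pct_mountainous", "has_highway", "school_score"]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=500):
    """Plain stochastic-gradient-descent logistic regression, no libraries."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the linear score
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict_proba(w, b, x):
    """Predicted probability that a county contains a grave site."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Synthetic training data: counties with a busted drug lab AND a highway
# are labeled positive (an invented rule, used only to make the demo run).
random.seed(0)
X, y = [], []
for _ in range(200):
    x = [random.randint(0, 1), random.randint(0, 1), random.randint(0, 1),
         random.random(), random.randint(0, 1), random.random()]
    X.append(x)
    y.append(1 if (x[0] and x[4]) else 0)

w, b = train_logistic(X, y)
high_risk = predict_proba(w, b, [1, 0, 0, 0.3, 1, 0.5])
low_risk = predict_proba(w, b, [0, 0, 0, 0.3, 0, 0.5])
```

After training, the model assigns a higher probability to the county whose features match the synthetic positive pattern than to one without them, which is all a sketch like this can show: the real modeling work lies in assembling honest labels and features, not in the classifier itself.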
Unbiased algorithms can still be problematic
“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with: Is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”
HRDAG Director of Research Patrick Ball, in agreement with Lum, argued that it is perhaps more practical to move the conversation away from bias at the individual level and instead describe it as bias at the institutional or structural level. If a police department, for example, is convinced it needs to police one neighborhood more than another, whether any individual officer is racist is less relevant, he said.
The causal impact of bail on case outcomes for indigent defendants in New York City
Kristian Lum, Erwin Ma and Mike Baiocchi (2017). The causal impact of bail on case outcomes for indigent defendants in New York City. Observational Studies 3 (2017) 39-64. 31 October 2017. © 2017 Institute of Mathematical Statistics.
Trump’s “extreme-vetting” software will discriminate against immigrants “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”