Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project
Patrick Ball. Who Did What to Whom? Planning and Implementing a Large Scale Human Rights Data Project. © 1996 American Association for the Advancement of Science.
A Statistical Analysis of the Guatemalan National Police Archive: Searching for Documentation of Human Rights Abuses.
Megan E. Price, Tamy Guberek, Daniel R. Guzmán, Paul Zador, Gary M. Shapiro (2009). “A Statistical Analysis of the Guatemalan National Police Archive: Searching for Documentation of Human Rights Abuses.” In JSM Proceedings, Survey Research Methods Section. Alexandria, VA: American Statistical Association.
The Demography of Large-Scale Human Rights Atrocities: Integrating demographic and statistical analysis into post-conflict historical clarification in Timor-Leste.
Romesh Silva and Patrick Ball. “The Demography of Large-Scale Human Rights Atrocities: Integrating demographic and statistical analysis into post-conflict historical clarification in Timor-Leste.” Paper presented at the 2006 meetings of the Population Association of America.
¿Quién le hizo qué a quién? Planear e implementar un proyecto a gran escala de información en derechos humanos.
Patrick Ball (2008). “¿Quién le hizo qué a quién? Planear e implementar un proyecto a gran escala de información en derechos humanos.” (Originally published in English by AAAS.) Translated by Beatriz Verjerano. Palo Alto, California: Benetech.
Ouster of Guatemala’s Attorney General
Haiti
Experts Greet Kosovo Memory Book
All the Dead We Cannot See
Ball, a statistician, has spent the last two decades finding ways to make the silence speak. He helped pioneer the use of formal statistical modeling, and, later, machine learning—tools more often used for e-commerce or digital marketing—to measure human rights violations that weren’t recorded. In Guatemala, his analysis helped convict former dictator General Efraín Ríos Montt of genocide in 2013. It was the first time a former head of state was found guilty of the crime in his own country.
Meet the data analyst putting the perpetrators of genocide in prison
Biotechniques published an interview with Patrick Ball, inspired by his John Maddox Prize award.
Why top funders back this small human rights organization with a global reach
Eric Sears, a director at the MacArthur Foundation who leads the grantmaker’s Technology in the Public Interest program, worked at Human Rights First and Amnesty International before joining MacArthur, and has been following HRDAG’s work for years. … One of HRDAG’s strengths is the long relationships it maintains with partners around the globe. “HRDAG is notable in that it really develops deep relationships and partnerships and trust with organizations and actors in different parts of the world,” Sears said. “I think they’re unique in the sense that they don’t parachute into a situation and do a project and leave. They tend to stick with organizations and with issues over the long term, and continually help build cases around evidence and documentation to ensure that when the day comes, when accountability is possible, the facts and the evidence are there.”
What happens when you look at crime by the numbers
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
The Ways AI Decides How Low-Income People Work, Live, Learn, and Survive
HRDAG is mentioned in the “child welfare (sometimes called “family policing”)” section: At least 72,000 low-income children are exposed to AI-related decision-making through government child welfare agencies’ use of AI to determine if they are likely to be neglected. As a result, these children experience heightened risk of being separated from their parents and placed in foster care.