

Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. She has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
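
One way to picture the preprocessing idea described above is to adjust the training features so that a sensitive variable cannot be read back off of them, and only then fit the predictive model. The sketch below is not Lum's published algorithm, only a minimal linear-residualization illustration of that general strategy; the residualize helper and the column names are hypothetical.

```python
# Minimal sketch, NOT Lum's published method: one simple way to strip the
# linear signature of a sensitive attribute from training features is to
# regress each feature on that attribute and keep only the residuals.
import pandas as pd
from sklearn.linear_model import LinearRegression

def residualize(features: pd.DataFrame, sensitive: pd.Series) -> pd.DataFrame:
    """Return features with their linear association to `sensitive` removed."""
    A = pd.get_dummies(sensitive, drop_first=True).to_numpy(dtype=float)
    adjusted = {}
    for col in features.columns:
        y = features[col].to_numpy(dtype=float)
        fit = LinearRegression().fit(A, y)
        adjusted[col] = y - fit.predict(A)  # residuals are uncorrelated with A
    return pd.DataFrame(adjusted, index=features.index)

# A model trained on residualize(X, race) cannot recover the linear component
# of race encoded in X, though nonlinear leakage can remain.
```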


System created to predict clandestine graves in Mexico

To that end, the Human Rights Data Analysis Group (HRDAG), the Human Rights Program of the Universidad Iberoamericana (UIA), and Data Cívica are carrying out a statistical analysis built on a variable that identifies clandestine graves through automated searches of local and national media, combined with geographic and sociodemographic data.


The scientist who uses statistics to find the disappeared in El Salvador, Guatemala, and Mexico

Patrick Ball is a bloodhound for the truth. That drive to uncover what others want to hide has led him to develop mathematical formulas for detecting the disappeared.

His work consists of applying scientific measurement methods to verify mass human rights violations.


Mapping Mexico’s hidden graves

When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.


On clandestine graves, we have more information than the government: Ibero

The model “can distinguish between the municipalities where we are going to find clandestine graves and those where it is unlikely that we will find them,” explained Patrick Ball, an American statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.
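
The items above describe a municipal-level classification model. The sketch below only illustrates that general approach, not the teams' actual model or features; the input file, covariates, and grave_found label are hypothetical placeholders.

```python
# Minimal sketch of a municipal-level classifier of the kind described above.
# This is NOT the HRDAG / Ibero / Data Cívica model; the file name, covariates,
# and label are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict

municipios = pd.read_csv("municipios.csv")                    # hypothetical input
features = municipios[["population", "road_density",
                       "poverty_rate", "dist_to_border_km"]]  # hypothetical covariates
labels = municipios["grave_found"]                            # 1 = grave reported

model = RandomForestClassifier(n_estimators=500, random_state=0)
# Out-of-sample probabilities rank municipalities from "likely to contain a
# clandestine grave" to "unlikely", which is the distinction Ball describes.
municipios["predicted_risk"] = cross_val_predict(
    model, features, labels, cv=5, method="predict_proba")[:, 1]
print(municipios.sort_values("predicted_risk", ascending=False).head())
```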


Clandestine graves in Mexico reveal the existence of crimes against humanity

Patrick Ball, an American statistician, is collaborating with the Human Rights Program of the Universidad Iberoamericana on an investigation into clandestine graves.


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about what happens if law enforcement gets predictive policing wrong.


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


What happens when you look at crime by the numbers

Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
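
A toy simulation can make the missing-data point concrete. The sketch below is not Lum and Isaac's analysis; the neighborhood names, rates, and recording probabilities are invented purely to show how uneven recording, rather than uneven crime, can drive what a model sees.

```python
# Toy simulation, not Lum and Isaac's analysis: true incident rates are equal
# in both (invented) neighborhoods, but the chance an incident is recorded
# differs, so the recorded data mirror the recording pattern, not the crime.
import numpy as np

rng = np.random.default_rng(0)
true_rate = {"neighborhood_A": 100, "neighborhood_B": 100}   # incidents/year
p_recorded = {"neighborhood_A": 0.6, "neighborhood_B": 0.2}  # recording intensity

for hood, rate in true_rate.items():
    occurred = rng.poisson(rate)
    recorded = rng.binomial(occurred, p_recorded[hood])
    print(f"{hood}: occurred={occurred}, recorded={recorded}")

# A forecasting model fit to `recorded` would rate neighborhood_A as roughly
# three times "riskier" than neighborhood_B despite identical true rates, and
# directing more patrols there skews the next round of records even further.
```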


Documenting Syrian Deaths with Data Science

Coverage of Megan Price at the Women in Data Science Conference held at Stanford University. “Price discussed her organization’s behind-the-scenes work to collect and analyze data on the ground for human rights advocacy organizations. HRDAG partners with a wide variety of human rights organizations, including local grassroots non-governmental groups and—most notably—multiple branches of the United Nations.”


How data science is changing the face of human rights

On the heels of the Women in Data Science conference, HRDAG executive director Megan Price says, “I think creativity and communication are probably the two most important skills for a data scientist to have these days.”


Can ‘predictive policing’ prevent crime before it happens?

HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”


Improving the estimate of U.S. police killings

Cory Doctorow of Boing Boing writes about HRDAG executive director Patrick Ball and his contribution to Carl Bialik’s 538 Politics article about the recently released Bureau of Justice Statistics report on the number of annual police killings, both reported and unreported.


Recognising Uncertainty in Statistics

In Responsible Data Reflection Story #7—from the Responsible Data Forum—work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point that quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”


New Estimate Of Killings By Police Is Way Higher — And Still Too Low

Carl Bialik of 538 Politics interviews HRDAG executive director Patrick Ball in an article about the recently released Bureau of Justice Statistics report about the number of annual police killings, both reported and unreported. As Bialik writes, this is a math puzzle with real consequences.


Calculating US police killings using methodologies from war-crimes trials

Cory Doctorow of Boing Boing writes about HRDAG director of research Patrick Ball’s article “Violence in Blue,” published March 4 in Granta. From the post: “In a must-read article in Granta, Ball explains the fundamentals of statistical estimation, and then applies these techniques to US police killings, merging data-sets from the police and the press to arrive at an estimate of the knowable US police homicides (about 1,250/year) and the true total (about 1,500/year). That means that of all the killings by strangers in the USA, one third are committed by the police.”
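
The arithmetic behind this kind of estimate, in its simplest two-list form, looks like the sketch below. The counts are invented for illustration and are not Ball's data; his published figures rest on a fuller analysis, and multiple systems estimation (mentioned in the Responsible Data item above) generalizes the idea to more than two lists and to dependence between them.

```python
# Illustrative two-list capture-recapture arithmetic; the counts are invented,
# not Ball's actual data, and real analyses handle list dependence and more
# than two sources (multiple systems estimation).
def lincoln_petersen(n_police: int, n_press: int, n_both: int) -> float:
    """Estimate the total number of incidents from two overlapping lists,
    assuming the lists are roughly independent samples of one population:
    total ~ (on police list) * (on press list) / (on both lists)."""
    if n_both == 0:
        raise ValueError("the two lists must overlap")
    return n_police * n_press / n_both

# Hypothetical example: 800 deaths in police records, 900 in press reports,
# 600 on both lists. Documented cases: 800 + 900 - 600 = 1,100; estimated
# total: 1,200, so roughly 100 cases appear on neither list.
print(lincoln_petersen(800, 900, 600))  # 1200.0
```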


Justice by the Numbers

Wilkerson was speaking at the inaugural Conference on Fairness, Accountability, and Transparency, a gathering of academics and policymakers working to make the algorithms that govern growing swaths of our lives more just. The woman who’d invited him there was Kristian Lum, the 34-year-old lead statistician at the Human Rights Data Analysis Group, a San Francisco-based non-profit that has spent more than two decades applying advanced statistical models to expose human rights violations around the world. For the past three years, Lum has deployed those methods to tackle an issue closer to home: the growing use of machine learning tools in America’s criminal justice system.


‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley

Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.

“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.


Gaza death toll 40% higher than official number, Lancet study finds

Patrick Ball, a statistician at the US-based Human Rights Data Analysis Group not involved in the research, has used capture-recapture methods to estimate death tolls for conflicts in Guatemala, Kosovo, Peru and Colombia.

Ball told AFP the well-tested technique had been used for centuries and that the researchers had reached “a good estimate” for Gaza.


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
