

Benetech Scientists Publish Analysis of Indirect Sampling Methods in the Journal of the American Medical Association


Former Leader of Guatemala Is Guilty of Genocide Against Mayan Group


Improving the estimate of U.S. police killings

Cory Doctorow of Boing Boing writes about HRDAG executive director Patrick Ball and his contribution to Carl Bialik's article, published in 538 Politics, about the recently released Bureau of Justice Statistics report on the number of annual police killings, both reported and unreported.


How statistics caught Indonesia’s war-criminals


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.


Inside the Difficult, Dangerous Work of Tallying the ISIS Death Toll

HRDAG executive director Megan Price is interviewed by Mother Jones. An excerpt: “Violence can be hidden,” says Price. “ISIS has its own agenda. Sometimes that agenda is served by making public things they’ve done, and I have to assume, sometimes it’s served by hiding things they’ve done.”


Can ‘predictive policing’ prevent crime before it happens?

HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”


How statistics lifts the fog of war in Syria

Megan Price, director of research, is quoted from her Strata talk, regarding how to handle multiple data sources in conflicts such as the one in Syria. From the blogpost:
“The true number of casualties in conflicts like the Syrian war seems unknowable, but the mission of the Human Rights Data Analysis Group (HRDAG) is to make sense of such information, clouded as it is by the fog of war. They do this not by nominating one source of information as the “best”, but instead with statistical modeling of the differences between sources.”
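The simplest version of modeling the differences between overlapping sources is two-list capture-recapture. The sketch below shows the classic Lincoln-Petersen estimator with hypothetical counts; HRDAG's actual multiple systems estimation models are far more sophisticated than this two-source illustration.

```python
def lincoln_petersen(n_a, n_b, n_both):
    """Estimate a total population from two overlapping lists.

    n_a: victims documented by source A
    n_b: victims documented by source B
    n_both: victims appearing on both lists
    """
    if n_both == 0:
        raise ValueError("the lists must overlap for the estimate to exist")
    return n_a * n_b / n_both

# Hypothetical counts: two documentation groups record 900 and 600
# victims respectively, and 150 victims appear on both lists.
estimate = lincoln_petersen(900, 600, 150)  # -> 3600.0
```

The intuition: if source B captured 150 of source A's 900 victims, it likely captured a similar fraction of the victims nobody documented, so the overlap lets us estimate how many are missing from both lists.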


Data Analysis by Benetech Scientists Aids in Arrest of Former Guatemalan Police Chief


Weapons of Math Destruction

Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:

As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm, it’s the training data and the lack of correction when the model produces erroneous conclusions.


How data science is changing the face of human rights

On the heels of the Women in Data Science conference, HRDAG executive director Megan Price says, “I think creativity and communication are probably the two most important skills for a data scientist to have these days.”


The Forensic Humanitarian

International human rights work attracts activists and lawyers, diplomats and retired politicians. One of the most admired figures in the field, however, is a ponytailed statistics guru from Silicon Valley named Patrick Ball, who has spent nearly two decades fashioning a career for himself at the intersection of mathematics and murder. You could call him a forensic humanitarian.


The Atrocity Archives


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


A Human Rights Statistician Finds Truth In Numbers

The tension started in the witness room. “You could feel the stress rolling off the walls in there,” Patrick Ball remembers. “I can remember realizing that this is why lawyers wear sport coats – you can’t see all the sweat on their arms and back.” He was, you could say, a little nervous to be cross-examined by Slobodan Milosevic.


Death March

A mapped representation of the scale and spread of killings in Syria. HRDAG’s director of research, Megan Price, is quoted.


Trump’s “extreme-vetting” software will discriminate against immigrants “Under a veneer of objectivity,” say experts

Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”


Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. In her work, she has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.


Data Mining on the Side of the Angels

“Data, by itself, isn’t truth.” How HRDAG uses data analysis and statistical methods to shed light on mass human rights abuses. Executive director Patrick Ball is quoted from his speech at the Chaos Communication Congress in Hamburg, Germany.


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.