In this afternoon "Lightning Talk" at RightsCon 2014, Megan Price spoke about the importance of using models to adjust for variability when reporting human rights violations and mentioned innovative tools that can be used for tracking abuses.
RIGHTSCON
March 4, 2014
San Francisco, California
"Revolution Analytics will allow HRDAG to handle bigger data sets and leverage the power of R to accomplish this goal and uncover the truth." Director of Research Megan Price is quoted.
REVOLUTION ANALYTICS
Press release
February 4, 2014
In her work on statistical issues in criminal justice, Lum has studied predictive policing: machine learning models that predict who will commit future crimes or where they will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and, in some cases, amplify that bias. She also works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of the FAT conference, which brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
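As a rough illustration of the preprocessing idea (an illustrative simplification, not Lum's actual published method), one simple way to strip a sensitive variable's signal from a feature is to residualize: regress the feature on the sensitive variable and keep only the residuals, which are uncorrelated with it by construction. All names and numbers below are invented.

```python
# Illustrative sketch: residualize a feature against a protected attribute.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
race = rng.integers(0, 2, n)                    # hypothetical binary protected attribute
income = 30 + 10 * race + rng.normal(0, 5, n)   # feature correlated with it

# Least-squares fit of the feature on the protected attribute (plus intercept)
X = np.column_stack([np.ones(n), race])
beta, *_ = np.linalg.lstsq(X, income, rcond=None)
income_adj = income - X @ beta                  # residuals: orthogonal to race

print(abs(np.corrcoef(race, income)[0, 1]))     # sizeable correlation
print(abs(np.corrcoef(race, income_adj)[0, 1])) # essentially zero
```

A model trained on `income_adj` instead of `income` can no longer recover the protected attribute from this feature, though removing dependence across many correlated features requires the more careful joint treatment the research addresses.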
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
Megan Price (2022). Beautiful game, ugly truth? Significance, 19: 18-21. December 2022. © The Royal Statistical Society. https://doi.org/10.1111/1740-9713.01702
Nick Cumming-Bruce of the New York Times writes about the UN Office of the High Commissioner for Human Rights' release of HRDAG's third report on reported killings in the Syrian conflict.
From the article:
In its third report on Syria commissioned by the United Nations, the Human Rights Data Analysis Group identified 191,369 deaths from the start of the conflict in March 2011 through April 2014, more than double the 92,901 deaths cited in its previous report, which covered the first two years of the conflict.
“Tragically, it is probably an underestimate of the real total number of people killed during the first three years of this murderous conflict,” ...
Violence against women in all its forms is a human rights violation. Most of our HRDAG colleagues are women, and for us, unfortunately, recent campaigns such as #metoo are unsurprising.
HRDAG and our partners Data Cívica and the Iberoamericana University created a machine-learning model to predict which counties (municipios) in Mexico have the highest probability of unreported hidden graves. The predictions help advocates to bring public attention and government resources to search for the disappeared in the places where they are most likely to be found.
Context
For more than ten years, Mexican authorities have been discovering hidden graves (fosas clandestinas). The casualties are attributed broadly—and sometimes inaccurately—to the country’s “drug war,” but the motivations and perpetrators behind the mass murders ...
The summer of 2002 in Washington, DC, was steamy and hot, which is how I remember my introduction to HRDAG. I had begun working with them, while they were still at AAAS, in the late spring, learning all about their core concepts: duplicate reporting and multiple systems estimation (MSE), controlled vocabularies, inter-rater reliability, data models, and more. The days were long, with a second shift more often than not running late into the evening. In addition to all the learning, I also helped with matching for the Chad project – that is, identifying multiple records of the same violation – back when matching was done by hand. But it was not long after I arrived in Washington ...
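The matching and MSE concepts mentioned above fit together: once duplicate records of the same violation have been matched across lists, the overlap itself carries information about how many violations no list recorded. The simplest illustration is the two-list (Lincoln-Petersen) estimator; the numbers below are invented, and real MSE analyses use three or more lists and model dependence between them.

```python
# Minimal sketch of the idea behind multiple systems estimation (MSE):
# the two-list Lincoln-Petersen estimator.
def lincoln_petersen(n_a, n_b, n_both):
    """Estimate the total population from two overlapping lists.

    n_a, n_b  -- record counts on list A and list B
    n_both    -- records matched as appearing on both lists
    """
    if n_both == 0:
        raise ValueError("no overlap: estimator undefined")
    return n_a * n_b / n_both

# Hypothetical example: 300 violations documented on one list, 200 on
# another, 60 matched duplicates -> roughly 1000 violations in total,
# most of them recorded by neither list.
print(lincoln_petersen(300, 200, 60))  # 1000.0
```

The quality of the matching step drives the estimate directly: undercounting duplicates inflates the total, overcounting deflates it, which is why careful record linkage matters so much.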
I joined the Benetech Human Rights Program at essentially the same time that HRDAG did, coming to Benetech from years of analyzing data for large companies in the transportation, hospitality and retail industries. But the data that HRDAG dealt with was not like the data I was familiar with, and I was fascinated to learn about how they used the data to determine "who did what to whom." Although some of the methodologies were similar to what I had experience with in the for-profit sector, the goals and beneficiaries of the analyses were very different.
At Benetech, I was initially predominantly focused on product management for Martus, a free ...
The interview poses questions about Lum's focus on artificial intelligence and its impact on predictive policing and sentencing programs.
In this story about how data are transforming the nonprofit world, Patrick Ball is quoted. Here's an excerpt:
"Data can have a profound impact on certain problems, but nonprofits are kidding themselves if they think the data techniques used by corporations can be applied wholesale to social problems," says Patrick Ball, head of the nonprofit Human Rights Data Analysis Group.
Companies, he says, maintain complete data sets. A business knows every product it made last year, when it sold, and to whom. Charities, he says, are a different story.
"If you're looking at poverty or trafficking or homicide, we don't have all the data, and we're not going to," ...
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
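The feedback loop Lum describes can be shown with a toy simulation (my own sketch, not PredPol's actual algorithm): two areas have identical underlying crime rates, but crime is recorded only where the patrol is sent, so an initial disparity in the historical counts compounds over time.

```python
# Toy simulation of a predictive-policing feedback loop.
import random

random.seed(1)
true_rate = [0.5, 0.5]   # identical underlying crime rates in both areas
observed = [60, 40]      # historical counts skewed by past enforcement

for day in range(200):
    # send the single patrol to the area with more recorded crime
    target = 0 if observed[0] >= observed[1] else 1
    if random.random() < true_rate[target]:
        observed[target] += 1   # crime is recorded only where police patrol

print(observed)  # area 0 accumulates all the new records; area 1 stays frozen
```

Because area 1 never gets a patrol, its count never changes, so the "evidence" for patrolling area 0 only grows, even though the two areas are identical by construction.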
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
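As a rough illustration of how predictor variables like those described might feed a model, the sketch below encodes a few hypothetical counties as feature vectors and scores them with a logistic function. Every county, weight, and value here is invented for illustration, not taken from the actual HRDAG/Data Cívica model.

```python
# Hypothetical encoding of county-level predictor variables.
import numpy as np

# One row per county: [drug_lab_busted, borders_us, borders_ocean,
#                      pct_mountainous, has_highway, school_score]
counties = np.array([
    [1, 1, 0, 0.10, 1, 0.62],
    [0, 0, 1, 0.45, 0, 0.48],
    [0, 0, 0, 0.80, 1, 0.55],
])

# With a fitted weight vector w and intercept b (invented numbers here),
# a logistic model maps each county's features to a probability that it
# contains unreported hidden graves.
w = np.array([1.2, 0.8, 0.6, 0.9, 0.4, -1.5])
b = -1.0
probs = 1 / (1 + np.exp(-(counties @ w + b)))
print(probs.round(3))
```

In practice the weights come from fitting on counties where graves have already been found, and the predicted probabilities are then used to prioritize searches in counties where none have been reported.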
HRDAG is delighted to announce five additions to our team: one new staff member, three summer interns, and one fellow.