

The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Reflections: The G in HRDAG is the Real Fuel

It took me a while to realize I had become part of the HRDAG incubator—at least that’s what it felt like to me—for young data analysts who wanted to use statistical knowledge to make a real impact on human rights debates.

Rise of the racist robots – how AI is learning all our worst impulses

“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
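The dynamic Lum and Isaac describe can be seen in a toy simulation. The sketch below is a minimal illustration only, with made-up neighborhoods, rates, and a deliberately naive patrol rule; it is not PredPol's algorithm or HRDAG's actual analysis.

```python
import random

random.seed(0)

# Two neighborhoods with identical underlying crime rates (illustrative assumption).
true_rate = {"A": 0.5, "B": 0.5}
# Historical crime records start slightly skewed toward A.
recorded = {"A": 10, "B": 5}

for day in range(365):
    # Naive "predictive" rule: patrol the neighborhood with more recorded crime.
    patrolled = max(recorded, key=recorded.get)
    for hood, rate in true_rate.items():
        # Crime occurs at the same rate in both neighborhoods, but is only
        # recorded where the patrol happens to be.
        if random.random() < rate and hood == patrolled:
            recorded[hood] += 1

print(recorded)  # A's count keeps growing; B's never moves, despite equal true rates
```

Because new records accrue only where patrols are sent, the initial skew compounds: the more heavily recorded neighborhood draws more patrols, which generate more records, which draw more patrols. That is the feedback loop described above.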


Mapping Mexico’s hidden graves

When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


Big data may be reinforcing racial bias in the criminal justice system

Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.


Estimating the human toll in Syria

Megan Price (2017). Estimating the human toll in Syria. Nature Human Behaviour. 8 February 2017. ISSN 2397-3374. © 2017 Macmillan Publishers Limited, part of Springer Nature. All rights reserved.


DATNAV: New Guide to Navigate and Integrate Digital Data in Human Rights Research

DatNav is the result of a collaboration between Amnesty International, Benetech, and The Engine Room, which began in late 2015 and culminated in an intense four-day writing sprint facilitated by Chris Michael and Collaborations for Change in May 2016. HRDAG consultant Jule Krüger is a contributor, and HRDAG director of research Patrick Ball is a reviewer.


Predictive policing violates more than it protects

William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.


Can ‘predictive policing’ prevent crime before it happens?

HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”


Amnesty International Reports Organized Murder Of Detainees In Syrian Prison

Reports of torture and disappearances in Syria are not new. But the Amnesty International report says the magnitude and severity of abuse have “increased drastically” since 2011. Citing the Human Rights Data Analysis Group, the report says “at least 17,723 people were killed in government custody between March 2011 and December 2015, an average of 300 deaths each month.”



The number of social leaders murdered is higher: Dejusticia

Contrary to what one might assume, official figures on murdered social leaders do not necessarily reflect reality, and victimization in the regions hit by this scourge could be far greater, according to the most recent report from the Centro de Estudios de Justicia, Derecho y Sociedad (Dejusticia), produced in collaboration with the Human Rights Data Analysis Group.


Counting The Dead: How Statistics Can Find Unreported Killings

Ball analyzed the data reporters had collected from a variety of sources – including on-the-ground interviews, police records, and human rights groups – and used a statistical technique called multiple systems estimation to roughly calculate the number of unreported deaths in three areas of the capital city Manila.

The team discovered that the number of drug-related killings was much higher than police had reported. The journalists, who published their findings last month in The Atlantic, documented 2,320 drug-linked killings over an 18-month period, approximately 1,400 more than the official number. Ball’s statistical analysis, which estimated the number of killings the reporters hadn’t heard about, found that close to 3,000 people could have been killed – more than three times the police figure.
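Multiple systems estimation builds on the capture-recapture idea: the overlap between independent lists of documented victims indicates how many victims appear on no list at all. The sketch below works through the simplest two-list (Lincoln-Petersen) version with invented list sizes; Ball's actual analysis used more lists and more sophisticated models, and these numbers are not the Manila figures.

```python
# Toy two-list capture-recapture (Lincoln-Petersen) estimate.
# All counts below are invented for illustration.
n_list1 = 900   # killings documented by, say, on-the-ground interviews
n_list2 = 700   # killings documented by, say, police or NGO records
n_both = 300    # killings appearing on both lists

# If the lists capture victims roughly independently, a victim's chance of
# appearing on list 2 is about the same whether or not they are on list 1:
#   n_both / n_list1  ~=  n_list2 / N_total
n_total_est = n_list1 * n_list2 / n_both

n_documented = n_list1 + n_list2 - n_both
n_undocumented_est = n_total_est - n_documented

print(f"documented on at least one list: {n_documented}")
print(f"estimated total killings:        {n_total_est:.0f}")
print(f"estimated undocumented killings: {n_undocumented_est:.0f}")
```

With three or more lists, the overlap patterns also let analysts test and relax the independence assumption, which is what makes the “multiple systems” version of the method more robust than this two-list toy.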

Ball said there are both moral and technical reasons for making sure everyone who has been killed in mass violence is counted.

“The moral reason is because everyone who has been murdered should be remembered,” he said. “A terrible thing happened to them and we have an obligation as a society to justice and to dignity to remember them.”


Lies, Damned Lies, and “Official” Statistics

Megan Price and Maria Gargiulo (2021). Lies, Damned Lies, and “Official” Statistics. Health and Human Rights Journal. 24 June 2021. © Health and Human Rights Journal.


The True Dangers of AI are Closer Than We Think

William Isaac is quoted.

