

Press Release, Timor-Leste, February 2006

SILICON VALLEY GROUP USES TECHNOLOGY TO HELP THE TRUTH COMMISSION ANSWER DISPUTED QUESTIONS ABOUT MASSIVE POLITICAL VIOLENCE IN TIMOR-LESTE

Palo Alto, CA, February 9, 2006 – The Benetech® Initiative today released a statistical report detailing widespread and systematic violations in Timor-Leste during the period 1974-1999. Benetech's statistical analysis establishes that at least 102,800 (+/- 11,000) Timorese died as a result of the conflict. Approximately 18,600 (+/- 1,000) Timorese were killed or disappeared, while the remainder died due to hunger and illness in excess of what would be expected due to peacetime mortality. The magnitude of deaths ...
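The arithmetic behind the quoted figures is straightforward (the intervals are as reported in the release, not derived here):

```python
# Figures as reported in the press release. The remainder is deaths from
# hunger and illness in excess of expected peacetime mortality.
total_conflict_deaths = 102_800   # +/- 11,000
killed_or_disappeared = 18_600    # +/- 1,000
excess_mortality = total_conflict_deaths - killed_or_disappeared
print(excess_mortality)  # → 84200
```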

Covid-19 Research and Resources

HRDAG is identifying and interpreting the best science we can find to shed light on the global crisis brought on by the novel coronavirus, about which we still know so little. Right now, most of the data on the virus SARS-CoV-2 and Covid-19, the condition caused by the virus, are incomplete and unrepresentative, which means that there is a great deal of uncertainty. But making sense of imperfect datasets is what we do. HRDAG is contributing to a better understanding with explainers, essays, and original research, and we are highlighting trustworthy resources for those who want to dig deeper.

FAQs on Predictive Policing and Bias

Last month Significance magazine published an article on the topic of predictive policing and police bias, which I co-authored with William Isaac. Since then, we've published a blogpost about it and fielded a few recurring questions. Here they are, along with our responses. Do your findings still apply given that PredPol uses crime reports rather than arrests as training data? Because this article was meant for an audience that is not necessarily well-versed in criminal justice data and we were under a strict word limit, we simplified language in describing the data. The data we used is a version of the Oakland Police Department’s crime report...

Counting the Dead in Sri Lanka

ITJP and HRDAG are urging groups inside and outside Sri Lanka to share existing casualty lists.

The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
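Ball's point about choosing which error a classifier should make can be made concrete with a toy confusion matrix. All numbers below are hypothetical, chosen only to show the trade-off, and are not drawn from HRDAG's work:

```python
# Two kinds of classifier error: flagging records that shouldn't be flagged
# (false positives) versus missing records that should be (false negatives).

def error_rates(tp, fp, fn, tn):
    """Return (false_positive_rate, false_negative_rate) from confusion-matrix counts."""
    fpr = fp / (fp + tn)  # share of negatives wrongly flagged
    fnr = fn / (fn + tp)  # share of true matches the classifier missed
    return fpr, fnr

# A lenient threshold flags more records: fewer misses, more false alarms.
lenient = error_rates(tp=90, fp=30, fn=10, tn=870)
# A strict threshold flags fewer records: more misses, fewer false alarms.
strict = error_rates(tp=60, fp=5, fn=40, tn=895)

print(lenient)  # higher false-positive rate, lower false-negative rate
print(strict)   # lower false-positive rate, higher false-negative rate
```

Neither threshold is "correct"; which error is preferable depends on the human cost of each mistake, which is exactly the framing question Ball raises.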


Amnesty International Reports Organized Murder Of Detainees In Syrian Prison

Reports of torture and disappearances in Syria are not new. But the Amnesty International report says the magnitude and severity of abuse have “increased drastically” since 2011. Citing the Human Rights Data Analysis Group, the report says “at least 17,723 people were killed in government custody between March 2011 and December 2015, an average of 300 deaths each month.”


Estimating the human toll in Syria

Megan Price (2017). Estimating the human toll in Syria. Nature Human Behaviour. 8 February 2017. ISSN 2397-3374. © 2017 Macmillan Publishers Limited, part of Springer Nature. All rights reserved.


Big data may be reinforcing racial bias in the criminal justice system

Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.



What happens when you look at crime by the numbers

Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


Mapping Mexico’s hidden graves

When Patrick Ball was introduced to Ibero’s database, the director of research at the Human Rights Data Analysis Group in San Francisco, California, saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.


Predictive policing violates more than it protects

William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.



Rise of the racist robots – how AI is learning all our worst impulses

“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
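The feedback loop described here can be sketched in a few lines. This is a deliberately simplified illustration of the mechanism, not Lum and Isaac's actual model, and every rate and count below is made up:

```python
import random

random.seed(0)

# Two districts with the SAME true crime rate, but district 0 starts with
# more *recorded* incidents because of historical over-policing.
true_rate = [0.5, 0.5]   # probability a crime occurs in each district each day
recorded = [20, 10]      # historical incident counts (the biased starting point)
patrols = [0, 0]

for day in range(1000):
    # Allocate today's patrol to the district with the most recorded crime.
    target = 0 if recorded[0] >= recorded[1] else 1
    patrols[target] += 1
    # Crime occurs at the same rate everywhere, but only the patrolled
    # district generates new records -- the model "learns" from its own output.
    for d in (0, 1):
        if random.random() < true_rate[d] and d == target:
            recorded[d] += 1

print(patrols)  # → [1000, 0]
```

Because district 1 is never patrolled, its recorded count never grows, so every single patrol goes to district 0 even though the underlying crime rates are identical. That runaway concentration, seeded by a small initial bias in the records, is the feedback loop.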


Reflections: The G in HRDAG is the Real Fuel

It took me a while to realize I had become part of the HRDAG incubator—at least that’s what it felt like to me—for young data analysts who wanted to use statistical knowledge to make a real impact on human rights debates.

Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


The Number of Murdered Social Leaders Is Higher: Dejusticia

Contrary to what one might assume, official data on murdered social leaders do not necessarily reflect reality, and victimization in the regions hit by this scourge could be far greater, according to the most recent report from the Centro de Estudios de Justicia, Derecho y Sociedad (Dejusticia) in collaboration with the Human Rights Data Analysis Group.


The Murders of Social Leaders Left Out of the Count

A joint investigation by Dejusticia and the Human Rights Data Analysis Group concluded that murders of social leaders in Colombia are undercounted. That is, the rise in these crimes in 2016 and 2017 could be even greater than what civil society organizations and official figures have reported.
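Undercount conclusions of this kind typically rest on multiple systems estimation: comparing independent lists of documented cases and using their overlap to estimate how many cases no list captured. HRDAG's actual models are more elaborate, but the simplest two-list version (Chapman's bias-corrected form of the Lincoln-Petersen estimator, shown here with made-up counts) conveys the idea:

```python
def chapman_estimate(n1, n2, m):
    """Two-list capture-recapture estimate of the total number of cases.

    n1, n2: cases documented by each of two independent lists
    m: cases appearing on both lists
    Chapman's bias-corrected Lincoln-Petersen estimator.
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: two organizations document killings of social
# leaders; a small overlap implies many cases that neither list captured.
n_hat = chapman_estimate(n1=120, n2=90, m=30)
documented = 120 + 90 - 30  # unique cases known across both lists
print(documented, round(n_hat))  # → 180 354
```

The gap between the documented count and the estimated total is the undercount: here, roughly half the estimated cases appear on neither list. Real applications must also check the estimator's independence and homogeneity assumptions, which is where the more elaborate models come in.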


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
