

The Death Toll in Syria


Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse

Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
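The feedback loop Lum describes can be sketched as a toy simulation. All numbers here are illustrative assumptions, not HRDAG's model: two neighborhoods have identical true crime rates, but one starts out over-policed, and patrols are reallocated each step according to recorded crime.

```python
# Hypothetical sketch of the feedback loop: recorded crime depends on
# where police are deployed, not only on where crime occurs, so an
# initial imbalance is "justified" by the records it generates.
true_rate = [0.5, 0.5]   # identical true per-capita crime rates
patrols   = [0.6, 0.4]   # neighborhood A starts out over-policed

for step in range(10):
    # Police record crime in proportion to their presence.
    recorded = [true_rate[i] * patrols[i] for i in range(2)]
    total = sum(recorded)
    # The algorithm allocates tomorrow's patrols by yesterday's records.
    patrols = [r / total for r in recorded]

print(patrols)  # the initial 60/40 imbalance persists indefinitely
```

Even with equal true crime rates, the allocation never converges to 50/50: the biased records reproduce the bias that created them.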


Beautiful game, ugly truth?

Megan Price (2022). Beautiful game, ugly truth? Significance, 19: 18-21. December 2022. © The Royal Statistical Society. https://doi.org/10.1111/1740-9713.01702


Estimating Deaths in Timor-Leste


Big data may be reinforcing racial bias in the criminal justice system

Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.


Syria’s status, the migrant crisis and talking to ISIS

In this week’s “Top Picks,” IRIN interviews HRDAG executive director Patrick Ball about giant data sets and whether we can trust them. “No matter how big it is, data on violence is always partial,” he says.


Truth Commissioner


Truth Commissioner

From the Guatemalan military to the South African apartheid police, code cruncher Patrick Ball singles out the perpetrators of political violence.


Predictive policing violates more than it protects

William Isaac and Kristian Lum. Predictive policing violates more than it protects. USA Today. December 2, 2016. © USA Today.


First Things First: Assessing Data Quality Before Model Quality.

Anita Gohdes and Megan Price (2013). Journal of Conflict Resolution, Volume 57, Issue 6, December 2013. © 2013 Journal of Conflict Resolution. All rights reserved. Reprinted with permission of SAGE. [online abstract] DOI: 10.1177/0022002712459708.


To predict and serve?

Kristian Lum and William Isaac (2016). To predict and serve? Significance. October 10, 2016. © 2016 The Royal Statistical Society. 


HRDAG and Boston PD SWAT Reports

HRDAG worked with the ACLU of Massachusetts to review Boston PD SWAT reports (the reports filled out before and after tactical and warrant service operations) made public under the 17F order, which requires the Mayor of Boston to release information about the Boston Police Department’s inventory of military-grade equipment, such as mine-resistant ambush-protected armored vehicles designed for use in Iraq. Investigating Boston Police Department SWAT Raids from 2012 to 2020: HRDAG collaborated with the Data for Justice Project on a tool allowing members of the public to visualize and analyze nearly a decade of Boston Police ...

Making the Case: The Role of Statistics in Human Rights Reporting.

Patrick Ball. “Making the Case: The Role of Statistics in Human Rights Reporting.” Statistical Journal of the United Nations Economic Commission for Europe. 18(2-3):163-174. 2001.


Selection Bias and the Statistical Patterns of Mortality in Conflict.

Megan Price and Patrick Ball. 2015. Statistical Journal of the IAOS 31: 263–272. doi: 10.3233/SJI-150899. © IOS Press and the authors. All rights reserved. Creative Commons BY-NC-SA.


South Africa

Under apartheid, South Africans from all sides suffered violence and human rights abuses. One of the mandates of the South African Truth and Reconciliation Commission (TRC) was to report truth by reporting on violations and victims. Dr. Patrick Ball, as Deputy Director of the Science and Human Rights Program (SHRP) of the American Association for the Advancement of Science (AAAS), used the who-did-what-to-whom data model to provide statistical analysis of the violations reported to the Commission, for use in the final report of the TRC.     Links: http://shr.aaas.org/southafrica/trcsa/ http://www.doj.gov.za/trc/index....

Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


The True Dangers of AI are Closer Than We Think

William Isaac is quoted.


Here’s how an AI tool may flag parents with disabilities

HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”


Data-driven crime prediction fails to erase human bias

Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”


Quantifying Injustice

“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol.  … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
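The synthetic-population idea can be sketched in a few lines. Instead of biased arrest records, one simulates individuals from survey-derived rates to estimate where behavior truly occurs. Neighborhood names, population sizes, and rates below are illustrative assumptions, not Lum and Isaac's actual data:

```python
import random

random.seed(42)

# Two hypothetical neighborhoods with equal populations and, per survey
# data, equal drug-use rates.
neighborhoods = {"A": 10_000, "B": 10_000}
survey_use_rate = {"A": 0.10, "B": 0.10}

# Synthetic population: simulate each resident as a Bernoulli draw from
# the survey rate, giving a ground-truth estimate independent of policing.
synthetic_users = {
    n: sum(random.random() < survey_use_rate[n] for _ in range(pop))
    for n, pop in neighborhoods.items()
}

# Police records, by contrast, reflect where officers patrol.
recorded_arrests = {"A": 400, "B": 100}  # A is over-policed 4:1

print(synthetic_users)   # roughly equal counts across neighborhoods
print(recorded_arrests)  # heavily skewed toward A
```

Comparing the model's predictions against the synthetic population, rather than against the skewed arrest records, is what lets the bias in the records be measured at all.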


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.

Donate