

‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley

Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.

“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.


Speaking Stats to Justice: Expert Testimony in a Guatemalan Human Rights Trial Based on Statistical Sampling


United Nations Issues Report on Deaths in Syria


Data Dive Reveals 15,000 New Victims of Syria War


60,000 Dead in Syria? Why the Death Toll is Likely Even Higher


Unbiased algorithms can still be problematic

“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”

HRDAG Director of Research Patrick Ball, in agreement with Lum, argued that it is perhaps more practical to frame the problem not as bias at the individual level but as bias at the institutional or structural level. If a police department is convinced it needs to police one neighborhood more than another, he said, it matters less whether any individual officer is racist.


Former Leader of Guatemala Is Guilty of Genocide Against Mayan Group


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crime or where it will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and, in some cases, amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of the FAT conference, which brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
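The bio above mentions removing sensitive information from training data so that predictions are “fair” with respect to variables like race and gender. As a hedged illustration only, the sketch below shows one generic preprocessing approach (residualizing features against a sensitive attribute); it is not Lum’s published method, and every variable name and number in it is hypothetical.

```python
# Minimal sketch (NOT Lum's published method) of one common preprocessing idea
# for "fair" prediction: regress each feature on the sensitive attribute and
# keep only the residuals, so the transformed features carry no linear
# information about that attribute. All names and data here are hypothetical.
import numpy as np

def strip_sensitive(X, s):
    """Remove the linear component of X explained by sensitive attribute s."""
    S = np.column_stack([np.ones(len(s)), s])      # intercept + sensitive attribute
    beta, *_ = np.linalg.lstsq(S, X, rcond=None)   # fit X ~ s
    return X - S @ beta                            # residualized features

rng = np.random.default_rng(0)
s = rng.integers(0, 2, size=500)                   # hypothetical binary attribute
X = np.column_stack([s + rng.normal(size=500),     # feature correlated with s
                     rng.normal(size=500)])        # feature independent of s
X_fair = strip_sensitive(X, s)
print(np.corrcoef(X_fair[:, 0], s)[0, 1])          # near zero: little linear association left
```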


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


Death March

A mapped representation of the scale and spread of killings in Syria. HRDAG’s director of research, Megan Price, is quoted.


Calculating US police killings using methodologies from war-crimes trials

Cory Doctorow of Boing Boing writes about HRDAG director of research Patrick Ball’s article “Violence in Blue,” published March 4 in Granta. From the post: “In a must-read article in Granta, Ball explains the fundamentals of statistical estimation, and then applies these techniques to US police killings, merging data-sets from the police and the press to arrive at an estimate of the knowable US police homicides (about 1,250/year) and the true total (about 1,500/year). That means that of all the killings by strangers in the USA, one third are committed by the police.”
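The gap between the “knowable” and “true” totals described above comes from comparing overlapping, incomplete lists. As a rough, hedged sketch of the two-list case only, the snippet below uses the textbook Chapman capture-recapture estimator; the counts are invented for illustration and are not the figures behind “Violence in Blue.”

```python
# Hedged sketch of two-list (capture-recapture) logic: the overlap between two
# incomplete lists -- say police records and press reports -- suggests how many
# incidents both lists missed. Standard Chapman estimator; the counts below are
# invented and are NOT the figures from "Violence in Blue".

def chapman_estimate(n_list_a, n_list_b, n_overlap):
    """Chapman's nearly unbiased two-list estimate of the total population size."""
    return (n_list_a + 1) * (n_list_b + 1) / (n_overlap + 1) - 1

total = chapman_estimate(n_list_a=900, n_list_b=800, n_overlap=600)
print(round(total))  # estimated total, including incidents missed by both lists
```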


Improving the estimate of U.S. police killings

Cory Doctorow of Boing Boing writes about HRDAG executive director Patrick Ball and his contribution to Carl Bialik’s article in 538 Politics about the recently released Bureau of Justice Statistics report on the number of annual police killings, both reported and unreported.


Sous la dictature d’Hissène Habré, le ridicule tuait

Patrick Ball, a statistical expert engaged by the Extraordinary African Chambers, concluded that “mortality in the DDS prisons was substantially higher than in the worst twentieth-century contexts involving prisoners of war.”


Inside the Difficult, Dangerous Work of Tallying the ISIS Death Toll

HRDAG executive director Megan Price is interviewed by Mother Jones. An excerpt: “Violence can be hidden,” says Price. “ISIS has its own agenda. Sometimes that agenda is served by making public things they’ve done, and I have to assume, sometimes it’s served by hiding things they’ve done.”


New UN report counts 191,369 Syrian-war deaths — but the truth is probably much, much worse

Amanda Taub of Vox has interviewed HRDAG executive director Patrick Ball about the UN Office of the High Commissioner for Human Rights’ release of HRDAG’s third report on reported killings in the Syrian conflict.
From the article:
Patrick Ball, Executive Director of the Human Rights Data Analysis Group and one of the report’s authors, explained to me that this new report is not a statistical estimate of the number of people killed in the conflict so far. Rather, it’s an actual list of specific victims who have been identified by name, date, and location of death. (The report only tracked violent killings, not “excess mortality” deaths from disease or hunger that the conflict is causing indirectly.)
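Because the report is an enumeration of named victims rather than an estimate, the core data task is de-duplicating records across source lists. The toy sketch below collapses records on an exact (name, date, location) key; real record linkage at this scale is probabilistic and far more involved, and the records shown are entirely hypothetical.

```python
# Toy sketch of de-duplication by exact key; HRDAG's actual record linkage is
# probabilistic and far more involved. All records below are hypothetical.
records = [
    {"name": "A. Example", "date": "2013-05-01", "location": "Aleppo"},
    {"name": "A. Example", "date": "2013-05-01", "location": "Aleppo"},  # same victim, second source
    {"name": "B. Example", "date": "2013-06-12", "location": "Homs"},
]

unique = {(r["name"], r["date"], r["location"]): r for r in records}
print(len(unique), "identified victims after de-duplication")  # 2 identified victims ...
```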


How statistics lifts the fog of war in Syria

Megan Price, director of research, is quoted from her Strata talk regarding how to handle multiple data sources in conflicts such as the one in Syria. From the blog post:
“The true number of casualties in conflicts like the Syrian war seems unknowable, but the mission of the Human Rights Data Analysis Group (HRDAG) is to make sense of such information, clouded as it is by the fog of war. They do this not by nominating one source of information as the “best”, but instead with statistical modeling of the differences between sources.”


Data Mining on the Side of the Angels

“Data, by itself, isn’t truth.” How HRDAG uses data analysis and statistical methods to shed light on mass human rights abuses. Executive director Patrick Ball is quoted from his speech at the Chaos Communication Congress in Hamburg, Germany.


Can ‘predictive policing’ prevent crime before it happens?

HRDAG analyst William Isaac is quoted in this article about so-called crime prediction. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.”


La misión de contar muertos


Estimating Deaths


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
