500 Tamils disappeared in Army custody — New Study

The Sri Lankan army must explain to the families of the disappeared and missing what happened to an estimated 500 Tamils who disappeared in its custody at the end of the war, on or around 18 May 2009, said two international NGOs that have been collating and analysing lists of names.

Sri Lanka has one of the world's largest numbers of enforced disappearances, but these 500 represent the largest number of disappearances at a single place and time in the country. For a detailed account of how the figure of 500 was estimated, see “How many people disappeared on 17-19 May 2009 in Sri Lanka?”.


Unbiased algorithms can still be problematic

“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”

HRDAG Director of Research Patrick Ball, agreeing with Lum, argued that it is perhaps more practical to frame the problem not as bias at the individual level but as bias at the institutional or structural level. If a police department, for example, is convinced it needs to police one neighborhood more than another, he said, it matters less whether any individual officer is racist.


What HBR Gets Wrong About Algorithms and Bias

“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared first hand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”


Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice

“I was always a math nerd. My mother has a Polaroid of me in the fourth grade with my science fair project … . It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.

I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”


Data ‘hashing’ improves estimate of the number of victims in databases

But while HRDAG’s estimate relied on the painstaking efforts of human workers to carefully weed out potential duplicate records, hashing with statistical estimation proved to be faster, easier and less expensive. The researchers said hashing also had the important advantage of a sharp confidence interval: The range of error is plus or minus 1,772, or less than 1 percent of the total number of victims.

“The big win from this method is that we can quickly calculate the probable number of unique elements in a dataset with many duplicates,” said Patrick Ball, HRDAG’s director of research. “We can do a lot with this estimate.”
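The article doesn't include the researchers' code, but the core idea — hashing each record to a canonical key so that duplicate reports collide, then counting distinct keys — can be illustrated with a toy sketch. The `canonicalize` rules and the sample records below are invented for illustration; they are not HRDAG's actual pipeline.

```python
import hashlib

def canonicalize(record):
    # Normalize fields so near-identical duplicate reports map to the same key.
    # These normalization rules are illustrative only.
    name = " ".join(record["name"].lower().split())
    date = record["date"].strip()
    return hashlib.sha256(f"{name}|{date}".encode()).hexdigest()

records = [
    {"name": "Ana  Gomez", "date": "2009-05-18"},
    {"name": "ana gomez",  "date": "2009-05-18"},   # duplicate of the first
    {"name": "Luis Perez", "date": "2009-05-17"},
]

# Distinct hashes estimate the number of unique victims among the reports.
unique_estimate = len({canonicalize(r) for r in records})
print(unique_estimate)  # 2
```

In practice the research pairs hashing with statistical estimation to put a confidence interval around the count, which this sketch omits.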


The Data Scientist Helping to Create Ethical Robots

Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.

What’s the relationship between statistics and AI and machine learning?

AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.


5 Questions for Kristian Lum

Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.


Celebrating Women in Statistics

In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing—machine learning models to predict who will commit future crime or where it will occur. In her work, she has demonstrated that if the training data encodes historical patterns of racially disparate enforcement, predictions from software trained with this data will reinforce and—in some cases—amplify this bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of FAT, a conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
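The excerpt doesn't specify Lum's methods, but one common flavor of such preprocessing — removing the component of a feature that is linearly predictable from a sensitive variable, so a downstream model cannot exploit it — can be sketched as follows. All data and variable names here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a sensitive attribute and a feature correlated with it.
sensitive = rng.integers(0, 2, size=500).astype(float)  # e.g. group membership
feature = 2.0 * sensitive + rng.normal(size=500)        # feature "leaks" the attribute

# Regress the feature on the sensitive attribute and keep only the residual,
# which is linearly uncorrelated with the attribute by construction.
X = np.column_stack([np.ones_like(sensitive), sensitive])
beta, *_ = np.linalg.lstsq(X, feature, rcond=None)
residual = feature - X @ beta

print(abs(np.corrcoef(sensitive, residual)[0, 1]) < 1e-8)  # True
```

Because ordinary-least-squares residuals are orthogonal to the regressors, training on `residual` instead of `feature` removes linear leakage of the sensitive variable; nonlinear leakage requires stronger methods.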


Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Former Leader of Guatemala Is Guilty of Genocide Against Mayan Group


Guatemala: Access to Archives Sheds Light on Case of Forced Disappearance


Trump’s “extreme-vetting” software will discriminate against immigrants “Under a veneer of objectivity,” say experts

Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
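Lum's concern is the base-rate problem: when the condition being screened for is very rare, even an accurate tool flags far more innocent people than true cases. A quick sketch with invented numbers (the population size, base rate, sensitivity, and false-positive rate below are all assumptions, not figures from the article):

```python
# Base-rate arithmetic behind the "false positives" concern (numbers invented).
population = 1_000_000
base_rate = 1e-4            # 100 true cases per million screened
sensitivity = 0.99          # fraction of true cases the tool flags
false_positive_rate = 0.01  # fraction of innocent people wrongly flagged

true_pos = population * base_rate * sensitivity
false_pos = population * (1 - base_rate) * false_positive_rate
precision = true_pos / (true_pos + false_pos)

print(round(false_pos))  # 9999 innocent people flagged
print(precision)         # ~0.01: only about 1% of flags are true cases
```

Even with 99 percent sensitivity and a 1 percent false-positive rate, roughly 99 of every 100 people flagged would be the “false positives” Lum describes.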


Carnegie Mellon Partners With Human Rights Data Analysis Group To Improve Syrian Casualty Reporting


Calculations for the Greater Good

Rollins School of Public Health — As executive director of the Human Rights Data Analysis Group, Megan Price uses statistics to shine a light on human rights abuses.


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.


Justice Served in Guatemala: Testimonies from The National Security Archive & Benetech’s Human Rights Data Analysis Group


Media Contact

To speak with the researchers at HRDAG, please fill out the form below. You can search our Press Room by keyword or by year.

Guatemalan Ex-Cops Get 40 Years for Labor Leader’s Slaying


Guilty Verdict and 40-Year Maximum Sentence in Edgar Fernando García Case


I Wanted Him Back Alive, An Account of Edgar Fernando García’s Case from Inside Tribunals Tower


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.

Donate