Data-driven development needs both social and computer scientists
Excerpt:
“Data scientists are programmers who ignore probability but like pretty graphs,” said Patrick Ball, a statistician and human rights advocate who cofounded the Human Rights Data Analysis Group.
“Data is broken,” Ball said. “Anyone who thinks they’re going to use big data to solve a problem is already on the path to fantasy land.”
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.
New death toll estimated in Syrian civil war
Kevin Uhrmacher of the Washington Post prepared a graph that illustrates reported deaths over time, by number of organizations reporting the deaths.
What we’ll need to find the true COVID-19 death toll
From the article: “Intentionally inconsistent tracking can also influence the final tally, notes Megan Price, a statistician at the Human Rights Data Analysis Group. During the Iraq War, for example, officials worked to conceal mortality or to cherry pick existing data to steer the political narrative. While wars are handled differently from pandemics, Price thinks the COVID-19 data could still be at risk of this kind of manipulation.”
A Data Double Take: Police Shootings
“In a recent article, social scientist Patrick Ball revisited his and Kristian Lum’s 2015 study, which made a compelling argument for the underreporting of lethal police shootings by the Bureau of Justice Statistics (BJS). Lum and Ball’s study may be old, but it bears revisiting amid debates over the American police system — debates that have featured plenty of data on the excessive use of police force. It is a useful reminder that many of the facts and figures we rely on require further verification.”
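Studies of this kind typically rely on multiple systems estimation, which infers a total count from the overlap between independent lists of the same events. A minimal two-list sketch using the Chapman-corrected Lincoln–Petersen estimator is below; the counts are invented for illustration and are not figures from the Lum and Ball study.

```python
def lincoln_petersen(n1: int, n2: int, m: int) -> float:
    """Two-list capture-recapture estimate of total population size.

    n1: records on list 1 (e.g., an official tally)
    n2: records on list 2 (e.g., a media-compiled list)
    m:  records matched across both lists
    Uses the Chapman bias-corrected form.
    """
    if m < 0 or n1 < m or n2 < m:
        raise ValueError("overlap cannot exceed either list")
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: 300 incidents on one list, 400 on another,
# 200 matched across both.
estimate = lincoln_petersen(300, 400, 200)  # ≈ 599.5 total incidents
```

The intuition: if the two lists capture events independently, a small overlap implies many events that neither list recorded, so the estimated total exceeds the 500 distinct records observed.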
500 Tamils forcibly disappeared in three days, after surrendering to army in 2009
A new study has estimated that over 500 Tamils were forcibly disappeared in just three days, after surrendering to the Sri Lankan army in May 2009.
The study, carried out by the Human Rights Data Analysis Group and the International Truth and Justice Project, drew on compiled lists identifying those known to have surrendered and estimated that 503 people were forcibly disappeared between 17 and 19 May 2009.
Why top funders back this small human rights organization with a global reach
Eric Sears, a director at the MacArthur Foundation who leads the grantmaker’s Technology in the Public Interest program, worked at Human Rights First and Amnesty International before joining MacArthur, and has been following HRDAG’s work for years. … One of HRDAG’s strengths is the long relationships it maintains with partners around the globe. “HRDAG is notable in that it really develops deep relationships and partnerships and trust with organizations and actors in different parts of the world,” Sears said. “I think they’re unique in the sense that they don’t parachute into a situation and do a project and leave. They tend to stick with organizations and with issues over the long term, and continually help build cases around evidence and documentation to ensure that when the day comes, when accountability is possible, the facts and the evidence are there.”
Press Release, Timor-Leste, February 2006
Weapons of Math Destruction
Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives. Excerpt:
As Patrick once explained to me, you can train an algorithm to predict someone’s height from their weight, but if your whole training set comes from a grade three class, and anyone who’s self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn’t the algorithm; it’s the training data and the lack of correction when the model produces erroneous conclusions.
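The height-from-weight example can be made concrete. The sketch below (all numbers invented) fits an ordinary least-squares line to a "grade-three-only" training set, then asks it about an adult; the extrapolation stays absurdly close to four feet.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, in closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Biased training set: only third-graders (toy numbers),
# weights 50-70 lbs, heights clustered near 48 inches (four feet).
train = [(50, 47), (55, 48), (60, 48), (65, 49), (70, 48)]
a, b = fit_line(*zip(*train))

# Predict the height of a 160-lb adult: the model has only ever
# seen children, so the estimate lands near 54 inches (4'6").
pred = a * 160 + b
```

Nothing in the fitting procedure is wrong; the failure comes entirely from a training sample that does not represent the population the model is asked about.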
How many people are going to die from COVID-19?
Patrick Ball, Kristian Lum, Tarak Shah and Megan Price (2020). How many people are going to die from COVID-19? Granta. 14 March 2020. © Granta Publications 2020.
How much faith can we place in coronavirus antibody tests?
Megan Price, Morgan Agnew, and David Peters (2020). How much faith can we place in coronavirus antibody tests? Granta. 28 April 2020. © Granta Publications 2020.
Machine learning is being used to uncover the mass graves of Mexico’s missing
“Patrick Ball, HRDAG’s Director of Research and the statistician behind the code, explained that the Random Forest classifier was able to predict with 100% accuracy which counties would go on to have mass graves found in them in 2014, by applying the model to data from 2013. The model also identified counties where no hidden mass graves had yet been found, but which show a high likelihood that such graves exist. This predictive aspect of the model holds the most potential for future research.”
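The workflow described, training on one year's data and testing against the next, can be sketched with a toy stand-in for a random forest: an ensemble of one-feature decision stumps, each fit on a bootstrap sample. The county features (counts of disappearance reports and prior discoveries) and all numbers below are invented; real work would use a full library implementation such as scikit-learn's.

```python
import random

def train_forest(rows, labels, n_trees=25, seed=0):
    """Toy stand-in for a random forest: decision stumps, each fit on
    a bootstrap sample using one randomly chosen feature."""
    rng = random.Random(seed)
    n, d = len(rows), len(rows[0])
    forest = []
    while len(forest) < n_trees:
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap sample
        ys = [labels[i] for i in idx]
        if len(set(ys)) < 2:                        # need both classes present
            continue
        j = rng.randrange(d)                        # random feature per tree
        xs = [rows[i][j] for i in idx]
        m1 = sum(x for x, y in zip(xs, ys) if y == 1) / sum(ys)
        m0 = sum(x for x, y in zip(xs, ys) if y == 0) / (len(ys) - sum(ys))
        forest.append((j, (m0 + m1) / 2, m1 > m0))  # split between class means
    return forest

def predict(forest, x):
    votes = sum((x[j] > t) == hi for j, t, hi in forest)
    return int(2 * votes >= len(forest))

# Invented "2013" county features: (disappearance reports, prior discoveries);
# label 1 means hidden graves were found in that county in 2014.
rows = [(12, 3), (9, 2), (1, 0), (0, 0), (15, 4), (2, 0)]
labels = [1, 1, 0, 0, 1, 0]
forest = train_forest(rows, labels)

predict(forest, (11, 3))  # a county resembling the positives -> 1
predict(forest, (1, 0))   # a county resembling the negatives -> 0
```

The "most potential" part of the quoted claim corresponds to the second call: scoring counties whose features resemble past positives even though no graves have been found there yet.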
Justice by the Numbers
Wilkerson was speaking at the inaugural Conference on Fairness, Accountability, and Transparency, a gathering of academics and policymakers working to make the algorithms that govern growing swaths of our lives more just. The woman who’d invited him there was Kristian Lum, the 34-year-old lead statistician at the Human Rights Data Analysis Group, a San Francisco-based non-profit that has spent more than two decades applying advanced statistical models to expose human rights violations around the world. For the past three years, Lum has deployed those methods to tackle an issue closer to home: the growing use of machine learning tools in America’s criminal justice system.
South Africa
Lessons at HRDAG: Making More Syrian Records Usable
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.