We’re happy to announce that our executive director, Patrick Ball, was presented with an honorary degree by Claremont Graduate University in Claremont, California. University President Deborah Freund presented the degree to Patrick at the university’s 88th annual commencement ceremony on Saturday, May 16, 2015. The degree conferred was Doctor of Science honoris causa.
“We at CGU are thrilled that Patrick Ball accepted our Honorary Degree invitation and joined us for commencement,” said Thomas Horan, CGU Professor and Director, Center for Information Systems and Technology. “Patrick’s work stands as a model for conducting first-r...
Upcoming Talks
TBA
Past Talks
2015
Presentation on the research behind the Evaluation of the Kosovo Memory Book Database. National Archive, Pristina, Kosovo. Patrick Ball. February 4, 2015.
How do we know what we know? Patrick Ball. Arizona State University. January 2015.
AAAS Science & Human Rights Coalition Meeting: Big Data & Human Rights. Megan Price, panelist. Washington, D.C. January 15-16, 2015.
Examining the Crisis in Syria: Conference Hosted by New America and Arizona State University’s Center on the Future of War and the Walter Cronkite School of Journalism and Mass Communication. Megan Price, panelist. Washingt...
On September 7, 2018, Kristian Lum and Patrick Ball participated in a panel at Disrupt San Francisco by TechCrunch. The talk was titled "Dismantling Algorithmic Bias." Brian Brackeen of Kairos was part of the panel as well, and the talk was moderated by TechCrunch reporter Megan Rose Dickey.
From the TechCrunch website, "Disrupt is a 3-day conference focused on breaking technology news and developments with big-name thought leaders who are making waves in the industry."
Video of the talk is available here, and Megan Rose Dickey's coverage is here.
SILICON VALLEY GROUP USES TECHNOLOGY TO HELP THE TRUTH COMMISSION ANSWER DISPUTED QUESTIONS ABOUT MASSIVE POLITICAL VIOLENCE IN TIMOR-LESTE
Palo Alto, CA, February 9, 2006 – The Benetech® Initiative today released a statistical report detailing widespread and systematic violations in Timor-Leste during the period 1974-1999. Benetech's statistical analysis establishes that at least 102,800 (+/- 11,000) Timorese died as a result of the conflict. Approximately 18,600 (+/- 1000) Timorese were killed or disappeared, while the remainder died due to hunger and illness in excess of what would be expected due to peacetime mortality.
The magnitude of deaths ...
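The release above breaks the toll down by cause. As a quick arithmetic check of that decomposition (point estimates only, setting aside the published +/- bounds):

```python
# Point-estimate decomposition of the Benetech figures quoted above
# (the published estimates carry +/- bounds, omitted here).
total_conflict_deaths = 102_800   # at least; +/- 11,000
killed_or_disappeared = 18_600    # +/- 1,000

# Remainder: deaths from hunger and illness in excess of what would be
# expected under peacetime mortality.
excess_mortality_deaths = total_conflict_deaths - killed_or_disappeared
print(excess_mortality_deaths)    # → 84200
```

So roughly four out of five conflict-related deaths in the estimate are attributed to excess hunger and illness rather than direct killing.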
Last month Significance magazine published an article on the topic of predictive policing and police bias, which I co-authored with William Isaac. Since then, we've published a blogpost about it and fielded a few recurring questions. Here they are, along with our responses.
Do your findings still apply given that PredPol uses crime reports rather than arrests as training data?
Because this article was meant for an audience that is not necessarily well-versed in criminal justice data and we were under a strict word limit, we simplified the language describing the data. The data we used are a version of the Oakland Police Department’s crime report...
ITJP and HRDAG are urging groups inside and outside Sri Lanka to share existing casualty lists.
HRDAG is identifying and interpreting the best science we can find to shed light on the global crisis brought on by the novel coronavirus, about which we still know so little. Right now, most of the data on the virus SARS-CoV-2 and Covid-19, the condition caused by the virus, are incomplete and unrepresentative, which means that there is a great deal of uncertainty. But making sense of imperfect datasets is what we do. HRDAG is contributing to a better understanding with explainers, essays, and original research, and we are highlighting trustworthy resources for those who want to dig deeper.
Papers and articles by HRDAG
HRDAG associate Miguel Cruz has an epiphany. All those data he’s drowning in? Each datapoint is a personal tragedy, a story both dark and urgent, and he’s privileged to have access.
On the anniversary of the Universal Declaration of Human Rights, HRDAG executive director Megan Price tells us why she loves her work, and why she feels hopeful about the future.
Congratulations to Patrick on this well deserved award!
Cory Doctorow of Boing Boing writes about HRDAG executive director Patrick Ball and his contribution to Carl Bialik's article about the recently released Bureau of Justice Statistics report on the number of annual police killings, both reported and unreported, in 538 Politics. Doctorow writes:
Patrick Ball and the Human Rights Data Analysis Group applied the same statistical rigor that he uses in estimating the scale of atrocities and genocides for Truth and Reconciliation panels in countries like Syria and Guatemala to the problem of estimating killing by US cops, and came up with horrific conclusions.
Ball was responding to a set of new estima...
When Patrick Ball, the director of research at the Human Rights Data Analysis Group in San Francisco, California, was introduced to Ibero’s database, he saw an opportunity to turn the data into a predictive model. Ball, who has used similar models to document human rights violations from Syria to Guatemala, soon invited Data Cívica, a Mexico City–based nonprofit that creates tools for analyzing data, to join the project.
Megan Price (2017). Estimating the human toll in Syria. Nature Human Behaviour. 8 February 2017. © 2017 Macmillan Publishers Limited, part of Springer Nature. All rights reserved. ISSN 2397-3374.
Laurel Eckhouse (2017). Big data may be reinforcing racial bias in the criminal justice system. Washington Post. 10 February 2017. © 2017 Washington Post.
Kristian Lum’s work on the HRDAG Policing Project is referred to here: “In fact, Lum argues, it’s not clear how well this model worked at depicting the situation in Oakland. Those data on drug crimes were biased, she now reports. The problem was not deliberate, she says. Rather, data collectors just missed some criminals and crime sites. So data on them never made it into her model.”
Work by HRDAG researchers Kristian Lum and William Isaac is cited in this article about the Policing Project: “While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.”
Kristian Lum discusses the challenges of getting accurate data from conflict zones, as well as her concerns about predictive policing if law enforcement gets it wrong.
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.
“Kristian Lum… organized a workshop together with Elizabeth Bender, a staff attorney for the NY Legal Aid Society and former public defender, and Terrence Wilkerson, an innocent man who had been arrested and could not afford bail. Together, they shared firsthand experience about the obstacles and inefficiencies that occur in the legal system, providing valuable context to the debate around COMPAS.”
“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.
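The feedback loop Lum describes can be illustrated with a deliberately simplified toy model (this is not PredPol’s actual algorithm, and the district setup is invented for illustration): two districts have identical true crime rates, patrols are sent wherever past records point, and crime is recorded only where patrols go.

```python
# Toy model of the "learning from previous crime reports" feedback loop.
# Two districts with identical true crime rates; the patrol goes to
# whichever district has more recorded crime, and crime is only
# recorded where the patrol is present.
records = [11, 10]  # a tiny, arbitrary head start in the historical records

for day in range(100):
    # allocate the single patrol to the "hotter" district on record
    busier = 0 if records[0] >= records[1] else 1
    # true rates are identical, so the patrol records a crime wherever it goes
    records[busier] += 1

print(records)  # → [111, 10]: the arbitrary head start compounds
```

District 0’s one-incident head start wins it every patrol, so its record grows while district 1’s is frozen: the diverging records reflect where police looked, not where crime occurred, which is the sense in which “predictive policing” finds patterns in police records rather than in crime.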