Welcoming Our New Data Scientist
A better statistical estimation of known Syrian war victims
Researchers from Rice University and Duke University are using the tools of statistics and data science in collaboration with Human Rights Data Analysis Group (HRDAG) to accurately and efficiently estimate the number of identified victims killed in the Syrian civil war.
…
Using records from four databases of people killed in the Syrian war, Chen, Duke statistician and machine learning expert Rebecca Steorts, and Rice computer scientist Anshumali Shrivastava estimated there were 191,874 unique individuals documented from March 2011 to April 2014. That’s very close to the estimate of 191,369 compiled in 2014 by HRDAG, a nonprofit that helps build scientifically defensible, evidence-based arguments about human rights violations.
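The core task here is record linkage: deciding when entries in different databases refer to the same person so that each victim is counted once. The minimal sketch below illustrates the idea with an exact-match key built from hypothetical fields (name, date of death, governorate); the actual Rice/Duke/HRDAG work uses far more sophisticated probabilistic and hashing-based linkage, which this toy version does not attempt to reproduce.

```python
# Illustrative sketch of counting unique documented victims across overlapping lists.
# Assumes records can be matched on a simple normalized key; real record linkage
# must handle spelling variants, transliteration, and missing fields.

def normalize(record):
    """Build a simplified match key from a record dict (hypothetical field names)."""
    name = " ".join(record["name"].split()).lower()   # collapse whitespace, lowercase
    return (name, record["date_of_death"], record["governorate"].strip().lower())

def count_unique(databases):
    """Return the number of distinct individuals across several lists."""
    seen = set()
    for db in databases:
        for record in db:
            seen.add(normalize(record))
    return len(seen)

# Toy data: one person appears on both lists, another on only one.
list_a = [{"name": "Ali  Hassan", "date_of_death": "2012-03-01", "governorate": "Homs"}]
list_b = [{"name": "ali hassan", "date_of_death": "2012-03-01", "governorate": "Homs"},
          {"name": "Omar Said", "date_of_death": "2013-07-14", "governorate": "Aleppo"}]
print(count_unique([list_a, list_b]))  # -> 2 unique individuals
```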
Multiple Systems Estimation: The Basics
Recognising Uncertainty in Statistics
In Responsible Data Reflection Story #7—from the Responsible Data Forum—work by HRDAG affiliates Anita Gohdes and Brian Root is cited extensively to make the point that quantitative data are the result of numerous subjective human decisions. An excerpt: “The Human Rights Data Analysis Group are pioneering the way in collecting and analysing figures of killings in conflict in a responsible way, using multiple systems estimation.”
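For readers unfamiliar with multiple systems estimation, the simplest version is two-list capture-recapture (the Lincoln-Petersen estimator): the overlap between independent lists tells you how much is missing from both. The sketch below uses invented counts purely for illustration; HRDAG's actual MSE work uses three or more lists and models dependence between them, which this toy estimator ignores.

```python
# Two-list capture-recapture (Lincoln-Petersen) sketch with invented numbers.

def lincoln_petersen(n_a, n_b, n_both):
    """Estimate total population size from two overlapping lists.

    n_a    -- victims documented on list A
    n_b    -- victims documented on list B
    n_both -- victims appearing on both lists
    """
    return n_a * n_b / n_both

# Hypothetical counts: 1,200 on list A, 900 on list B, 300 on both.
estimate = lincoln_petersen(1200, 900, 300)
print(round(estimate))  # -> 3600 estimated total, documented plus undocumented
```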
HRDAG – 25 Years and Counting
Counting the Unknown Victims of Political Violence: The Work of the Human Rights Data Analysis Group
Ann Harrison (2012). Counting the Unknown Victims of Political Violence: The Work of the Human Rights Data Analysis Group, in Human Rights and Information Communications Technologies: Trends and Consequences of Use. © 2012 IGI Global. All rights reserved.
On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations
Romesh Silva. “On ensuring a higher level of data quality when documenting human rights violations to support research into the origins and cause of human rights violations.” ASA Proceedings of the Joint Statistical Meetings, the International Biometric Society (ENAR and WNAR), the Institute of Mathematical Statistics, and the Statistical Society of Canada. August, 2002.
Different Convenience Samples, Different Stories: The Case of Sierra Leone.
Anita Gohdes. “Different Convenience Samples, Different Stories: The Case of Sierra Leone.” Benetech. 2010. © 2010 Benetech. Creative Commons BY-NC-SA.
Data Science Symposium at Vanderbilt
Sierra Leone TRC Data and Statistical Appendix
Welcome!
Reflections: Some Stories Shape You
Momentous Verdict against Hissène Habré
Perú
Big Data Predictive Analytics Comes to Academic and Nonprofit Institutions to Fuel Innovation
Using Math and Science to Count Killings in Syria
Welcoming a New Board Member
Celebrating Women in Statistics
In her work on statistical issues in criminal justice, Lum has studied uses of predictive policing: machine learning models that predict who will commit future crime or where it will occur. She has demonstrated that if the training data encode historical patterns of racially disparate enforcement, predictions from software trained on those data will reinforce and, in some cases, amplify that bias. She also currently works on statistical issues related to criminal “risk assessment” models used to inform judicial decision-making. As part of this thread, she has developed statistical methods for removing sensitive information from training data, guaranteeing “fair” predictions with respect to sensitive variables such as race and gender. Lum is active in the fairness, accountability, and transparency (FAT) community and serves on the steering committee of the FAT conference, which brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
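One simple way to picture the pre-processing idea of removing sensitive information from training data is to regress each feature on the sensitive attribute and keep only the residuals, so the remaining features carry no linear association with it. The sketch below is an illustration of that general idea, not Lum's specific method, and the data and variable names are invented.

```python
# Illustrative sketch: strip linear association with a sensitive attribute from
# features before model training. Not a specific published method; a toy example
# of the general "remove sensitive information from training data" idea.

import numpy as np

def residualize(X, s):
    """Return features with their linear dependence on s removed.

    X -- (n_samples, n_features) feature matrix
    s -- (n_samples,) encoded sensitive attribute (e.g., 0/1)
    """
    S = np.column_stack([np.ones_like(s, dtype=float), s])  # intercept + sensitive variable
    beta, *_ = np.linalg.lstsq(S, X, rcond=None)             # per-feature regression on s
    return X - S @ beta                                       # residuals are uncorrelated with s

# Toy check: the residualized feature has ~zero correlation with s.
rng = np.random.default_rng(0)
s = rng.integers(0, 2, size=500).astype(float)
X = np.column_stack([2 * s + rng.normal(size=500), rng.normal(size=500)])
X_fair = residualize(X, s)
print(np.corrcoef(X_fair[:, 0], s)[0, 1])  # close to 0
```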
Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology
One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.