Guatemala CIIDH Data
Haiti
Data-driven development needs both social and computer scientists
Excerpt:
“Data scientists are programmers who ignore probability but like pretty graphs,” said Patrick Ball, a statistician and human rights advocate who cofounded the Human Rights Data Analysis Group.
“Data is broken,” Ball said. “Anyone who thinks they’re going to use big data to solve a problem is already on the path to fantasy land.”
Are journalists lowballing the number of Iraqi war dead?
The Columbia Journalism Review investigates the casualty count in Iraq, more than a decade after the U.S. invasion. HRDAG executive director Patrick Ball is quoted. “IBC is very good at covering the bombs that go off in markets,” said Patrick Ball, an analyst at the Human Rights Data Analysis Group who says he has spent his whole career studying “people being killed.” But quiet assassinations and military skirmishes away from the capital often receive little or no media attention.
Truth Commissioner
From the Guatemalan military to the South African apartheid police, code cruncher Patrick Ball singles out the perpetrators of political violence.
Syria’s status, the migrant crisis and talking to ISIS
In this week’s “Top Picks,” IRIN interviews HRDAG executive director Patrick Ball about giant data sets and whether we can trust them. “No matter how big it is, data on violence is always partial,” he says.
Testimonials
Hat-Tip from Guatemala Judges on HRDAG Evidence
Big Data and Death at UW-Madison
Quantifying Police Misconduct in Louisiana
HRDAG Drops Dropbox
Here’s how an AI tool may flag parents with disabilities
HRDAG contributed to work by the ACLU showing that a predictive tool used to guide responses to alleged child neglect may forever flag parents with disabilities. “These predictors have the effect of casting permanent suspicion and offer no means of recourse for families marked by these indicators,” according to the analysis from researchers at the ACLU and the nonprofit Human Rights Data Analysis Group. “They are forever seen as riskier to their children.”
Perú
Liberian Truth and Reconciliation Commission Data
Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice
“I was always a math nerd. My mother has a Polaroid of me in the fourth grade with my science fair project … . It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.
I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”
Using statistics to estimate the true scope of the secret killings at the end of the Sri Lankan civil war
In the last three days of the Sri Lankan civil war, as thousands of people surrendered to government authorities, hundreds of people were put on buses driven by Army officers. Many were never seen again.
In a report released today (see here), the International Truth and Justice Project for Sri Lanka and the Human Rights Data Analysis Group showed that over 500 people were disappeared on only three days — 17, 18, and 19 May.
How statistics lifts the fog of war in Syria
Megan Price, director of research, is quoted from her Strata talk regarding how to handle multiple data sources in conflicts such as the one in Syria. From the blog post:
“The true number of casualties in conflicts like the Syrian war seems unknowable, but the mission of the Human Rights Data Analysis Group (HRDAG) is to make sense of such information, clouded as it is by the fog of war. They do this not by nominating one source of information as the “best”, but instead with statistical modeling of the differences between sources.”
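To give a sense of the kind of modeling Price describes, here is a minimal, hypothetical sketch of two-list capture-recapture (the Lincoln-Petersen estimator), the simplest relative of the multiple systems estimation HRDAG uses across several overlapping lists. The list sizes and overlap below are invented for illustration; real analyses involve careful record matching and far richer models.

```python
# Illustrative sketch only: a two-list capture-recapture (Lincoln-Petersen)
# estimate of total casualties. HRDAG's real analyses use multiple systems
# estimation over several overlapping lists; all counts here are hypothetical.

def lincoln_petersen(n_a: int, n_b: int, n_both: int) -> float:
    """Estimate the total population size from two overlapping lists.

    n_a    -- records appearing on list A
    n_b    -- records appearing on list B
    n_both -- records matched across both lists
    """
    if n_both == 0:
        raise ValueError("lists must overlap to estimate the total")
    return n_a * n_b / n_both

# Hypothetical example: two documentation groups record 900 and 700 victims,
# and 300 victims are matched across both lists.
print(f"Estimated total victims: {lincoln_petersen(900, 700, 300):.0f}")  # 2100
```

The intuition is that the overlap between sources carries the information: when large lists share only a small number of matched records, many victims were likely documented by no one.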
Courts and police departments are turning to AI to reduce bias, but some argue it’ll make the problem worse
Kristian Lum: “The historical over-policing of minority communities has led to a disproportionate number of crimes being recorded by the police in those locations. Historical over-policing is then passed through the algorithm to justify the over-policing of those communities.”
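A toy simulation can make the feedback loop Lum describes concrete. The sketch below is not HRDAG's or Lum's actual analysis; it assumes two neighborhoods with identical true crime rates and a hypothetical allocator that assigns patrols in proportion to historically recorded crime, so the neighborhood that starts out over-policed keeps generating more recorded crime and therefore more patrols.

```python
# Toy feedback-loop illustration (hypothetical, not an actual HRDAG analysis):
# two neighborhoods with the same true crime rate, but crime is only recorded
# where patrols are sent, and patrols are allocated from past recorded crime.

import random

random.seed(0)

TRUE_CRIME_RATE = 0.3            # identical in both neighborhoods
recorded = {"A": 10, "B": 5}     # A starts out over-represented in the data
PATROLS_PER_DAY = 20

for day in range(100):
    total = recorded["A"] + recorded["B"]
    for hood in ("A", "B"):
        # Patrols are allocated in proportion to historical recorded crime.
        patrols = round(PATROLS_PER_DAY * recorded[hood] / total)
        # Crime is only recorded where police are present to observe it.
        recorded[hood] += sum(random.random() < TRUE_CRIME_RATE
                              for _ in range(patrols))

print(recorded)  # neighborhood A ends up with far more recorded crime than B
```

Despite identical underlying behavior, the initial disparity in the records is carried forward by the allocation rule, which is the mechanism Lum points to.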