Innocence Discovery Lab – Harnessing Large Language Models to Surface Data Buried in Wrongful Conviction Case Documents
Ayyub Ibrahim, Huy Dao, and Tarak Shah (2024). "Innocence Discovery Lab – Harnessing Large Language Models to Surface Data Buried in Wrongful Conviction Case Documents." The Wrongful Conviction Law Review 5(1): 103-125. https://doi.org/10.29173/wclawr112. Published 31 May 2024. Copyright (c) 2024 Ayyub Ibrahim, Huy Dao, Tarak Shah. Creative Commons Attribution 4.0 International License.
Even if there’s a ceasefire, thousands of deaths projected in Gaza over next 6 months
In this NPR story, HRDAG’s Patrick Ball comments on first-of-its-kind projections.
HRDAG Retreat 2018
Unbiased algorithms can still be problematic
“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with: is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”
HRDAG Director of Research Patrick Ball, in agreement with Lum, argued that it’s perhaps more practical to move the discussion away from bias at the individual level and instead frame it as bias at the institutional or structural level. If a police department, for example, is convinced it needs to police one neighborhood more heavily than another, it matters less whether any individual officer is racist, he said.
PredPol amplifies racially biased policing
HRDAG associate William Isaac is quoted in this article about how predictive policing algorithms such as PredPol exacerbate the problem of racial bias in policing.
Why It Took So Long To Update the U.N.-Sponsored Syria Death Count
In this story, Carl Bialik of FiveThirtyEight interviews HRDAG executive director Patrick Ball about the process of de-duplication, integration of databases, and machine-learning in the recent enumeration of reported casualties in Syria.
New reports of old deaths come in all the time, Ball said, making it tough to maintain a database. The duplicate-removal process means “it’s a lot like redoing the whole project each time,” he said.
Nonprofits Are Taking a Wide-Eyed Look at What Data Could Do
In this story about how data are transforming the nonprofit world, Patrick Ball is quoted. Here’s an excerpt: “Data can have a profound impact on certain problems, but nonprofits are kidding themselves if they think the data techniques used by corporations can be applied wholesale to social problems,” says Patrick Ball, head of the nonprofit Human Rights Data Analysis Group.
Companies, he says, maintain complete data sets. A business knows every product it made last year, when each one sold, and to whom. Charities, he says, are a different story.
“If you’re looking at poverty or trafficking or homicide, we don’t have all the data, and we’re not going to,” he says. “That’s why these amazing techniques that the industry people have are great in industry, but they don’t actually generalize to our space very well.”