

The Devil is in the Details: Interrogating Values Embedded in the Allegheny Family Screening Tool

Anjana Samant, Noam Shemtov, Kath Xu, Sophie Beiers, Marissa Gerchick, Ana Gutierrez, Aaron Horowitz, Tobi Jegede, Tarak Shah (2023). The Devil is in the Details: Interrogating Values Embedded in the Allegheny Family Screening Tool. ACLU. Summer 2023.




Lies, Damned Lies, and “Official” Statistics

Megan Price and Maria Gargiulo (2021). Lies, Damned Lies, and “Official” Statistics. Health and Human Rights Journal. 24 June, 2021. © Health and Human Rights Journal.


The True Dangers of AI are Closer Than We Think

William Isaac is quoted.


Lessons at HRDAG: Making More Syrian Records Usable

If we could glean key missing information from those fields, we would be able to use more records.

A Data Double Take: Police Shootings

“In a recent article, social scientist Patrick Ball revisited his and Kristian Lum’s 2015 study, which made a compelling argument for the underreporting of lethal police shootings by the Bureau of Justice Statistics (BJS). Lum and Ball’s study may be old, but it bears revisiting amid debates over the American police system — debates that have featured plenty of data on the excessive use of police force. It is a useful reminder that many of the facts and figures we rely on require further verification.”


Quantifying Injustice

“In 2016, two researchers, the statistician Kristian Lum and the political scientist William Isaac, set out to measure the bias in predictive policing algorithms. They chose as their example a program called PredPol. … Lum and Isaac faced a conundrum: if official data on crimes is biased, how can you test a crime prediction model? To solve this problem, they turned to a technique used in statistics and machine learning called the synthetic population.”
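The synthetic-population idea can be sketched in a few lines: simulate a ground-truth distribution of incidents that the analyst controls, then simulate biased recording on top of it and compare the two. The neighborhoods, counts, and recording probabilities below are invented purely for illustration; the actual study constructed its synthetic population from public-health survey data rather than a toy simulation like this one.

```python
import random

random.seed(0)

# Invented sketch: true incidents are equal across four neighborhoods,
# but the recording probability (a stand-in for policing intensity) is not.
neighborhoods = ["A", "B", "C", "D"]
report_prob = {"A": 0.9, "B": 0.6, "C": 0.3, "D": 0.1}

true_counts = {n: 0 for n in neighborhoods}
recorded_counts = {n: 0 for n in neighborhoods}

for _ in range(10_000):
    n = random.choice(neighborhoods)      # true incidents: uniform
    true_counts[n] += 1
    if random.random() < report_prob[n]:  # biased recording
        recorded_counts[n] += 1

# A model trained on recorded_counts would "learn" that neighborhood A
# has far more crime than D, even though the synthetic ground truth
# is identical everywhere.
print(true_counts)
print(recorded_counts)
```

Because the ground truth is known by construction, the gap between the two tallies isolates the recording bias, which is exactly what makes a synthetic population useful for auditing a model trained on official records.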


What we’ll need to find the true COVID-19 death toll

From the article: “Intentionally inconsistent tracking can also influence the final tally, notes Megan Price, a statistician at the Human Rights Data Analysis Group. During the Iraq War, for example, officials worked to conceal mortality or to cherry pick existing data to steer the political narrative. While wars are handled differently from pandemics, Price thinks the COVID-19 data could still be at risk of this kind of manipulation.”


How do epidemiologists know how many people will get Covid-19?

Patrick Ball (2020). How do epidemiologists know how many people will get Covid-19? Significance. 09 April 2020. © 2020 The Royal Statistical Society.



How many people are infected with Covid-19?

Tarak Shah (2020). How many people are infected with Covid-19? Significance. 09 April 2020. © 2020 The Royal Statistical Society.



Counting The Dead: How Statistics Can Find Unreported Killings

Ball analyzed the data reporters had collected from a variety of sources – including on-the-ground interviews, police records, and human rights groups – and used a statistical technique called multiple systems estimation to roughly calculate the number of unreported deaths in three areas of the capital city Manila.

The team discovered that the number of drug-related killings was much higher than police had reported. The journalists, who published their findings last month in The Atlantic, documented 2,320 drug-linked killings over an 18-month period, approximately 1,400 more than the official number. Ball’s statistical analysis, which estimated the number of killings the reporters hadn’t heard about, found that close to 3,000 people could have been killed – more than three times the police figure.
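Multiple systems estimation builds on a capture-recapture idea: the more two independent lists of victims overlap, the fewer deaths both are likely to have missed. Its simplest special case, the two-list Lincoln-Petersen estimator, can be written in a few lines. The list sizes and overlap below are invented for illustration only and are not the Manila figures; the actual analysis drew on more than two sources and more elaborate models.

```python
def lincoln_petersen(n1: int, n2: int, m: int) -> float:
    """Two-list capture-recapture estimate of a total population.

    n1: cases documented by source 1 (e.g. press reports)
    n2: cases documented by source 2 (e.g. police records)
    m:  cases appearing on both lists
    """
    if m == 0:
        raise ValueError("the two lists must overlap")
    return n1 * n2 / m

# Invented example: lists of 900 and 800 documented killings,
# with 300 killings appearing on both.
total = lincoln_petersen(900, 800, 300)  # 2400.0 estimated killings
documented = 900 + 800 - 300             # 1400 unique documented cases
print(total - documented)                # ~1000 killings missed by both lists
```

The estimate exceeds the documented count precisely because a small overlap implies each list captured only a fraction of the true total.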

Ball said there are both moral and technical reasons for making sure everyone who has been killed in mass violence is counted.

“The moral reason is because everyone who has been murdered should be remembered,” he said. “A terrible thing happened to them and we have an obligation as a society to justice and to dignity to remember them.”


At Toronto’s Tamil Fest, human rights group seeks data on Sri Lanka’s civil war casualties

Earlier this year, the Canadian Tamil Congress connected with HRDAG to bring its campaign to Toronto’s annual Tamil Fest, one of the largest gatherings of Canada’s Sri Lankan diaspora.

Ravichandradeva, along with a few other volunteers, spent the weekend speaking with festival-goers in Scarborough about the project and encouraging them to come forward with information about deceased or missing loved ones and friends.

“The idea is to collect thorough, scientifically rigorous numbers on the total casualties in the war and present them as a non-partisan, independent organization,” said Michelle Dukich, a data consultant with HRDAG.


The Untold Dead of Rodrigo Duterte’s Philippines Drug War

From the article: “Based on Ball’s calculations, using our data, nearly 3,000 people could have been killed in the three areas we analyzed in the first 18 months of the drug war. That is more than three times the official police count.”


‘Bias deep inside the code’: the problem with AI ‘ethics’ in Silicon Valley

Kristian Lum, the lead statistician at the Human Rights Data Analysis Group, and an expert on algorithmic bias, said she hoped Stanford’s stumble made the institution think more deeply about representation.

“This type of oversight makes me worried that their stated commitment to the other important values and goals – like taking seriously creating AI to serve the ‘collective needs of humanity’ – is also empty PR spin and this will be nothing more than a vanity project for those attached to it,” she wrote in an email.


The ghost in the machine

“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in occurrence of crime.


Cifra de líderes sociales asesinados es más alta: Dejusticia (The Number of Murdered Social Leaders Is Higher: Dejusticia)

Contrary to what one might think, official data on murdered social leaders do not necessarily correspond to reality, and victimization in the regions hit by this scourge could be far higher, according to the most recent report from the Centro de Estudios de Justicia, Derecho y Sociedad (Dejusticia) in collaboration with the Human Rights Data Analysis Group.


Data Mining on the Side of the Angels

“Data, by itself, isn’t truth.” How HRDAG uses data analysis and statistical methods to shed light on mass human rights abuses. Executive director Patrick Ball is quoted from his speech at the Chaos Communication Congress in Hamburg, Germany.


Rise of the racist robots – how AI is learning all our worst impulses

“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.” And the public perception might be that the algorithms are impartial.


Reflections: The G in HRDAG is the Real Fuel

It took me a while to realize I had become part of the HRDAG incubator—at least that’s what it felt like to me—for young data analysts who wanted to use statistical knowledge to make a real impact on human rights debates.

Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology

One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans’ partnership with Palantir, but he recognized the data-mapping model at the heart of the program. “I think the data they’re using, there are serious questions about its predictive power. We’ve seen very little about its ability to forecast violent crime,” Isaac said.


Our work has been used by truth commissions, international criminal tribunals, and non-governmental human rights organizations. We have worked with partners on projects on five continents.
