What we’ll need to find the true COVID-19 death toll
From the article: “Intentionally inconsistent tracking can also influence the final tally, notes Megan Price, a statistician at the Human Rights Data Analysis Group. During the Iraq War, for example, officials worked to conceal mortality or to cherry pick existing data to steer the political narrative. While wars are handled differently from pandemics, Price thinks the COVID-19 data could still be at risk of this kind of manipulation.”
A look at the top contenders for the 2022 Nobel Peace Prize
The Washington Post’s Paul Schemm recognized HRDAG’s work in Syria, in the category of research and activism. “HRDAG gained renown at the start of the war, when it was one of the few organizations that tried to put a number on the war’s enormous toll in Syrian lives.”
The ghost in the machine
“Every kind of classification system – human or machine – has several kinds of errors it might make,” [Patrick Ball] says. “To frame that in a machine learning context, what kind of error do we want the machine to make?” HRDAG’s work on predictive policing shows that “predictive policing” finds patterns in police records, not patterns in the occurrence of crime.
On clandestine graves, we have more information than the government: Ibero
The model “can distinguish between the municipalities where we are going to find clandestine graves and those where it is unlikely that we will find them,” explained Patrick Ball, an American statistician who collaborates with the Human Rights Program of the Universidad Iberoamericana in Mexico City.
Trump’s “extreme-vetting” software will discriminate against immigrants, “under a veneer of objectivity,” say experts
Kristian Lum, lead statistician at the Human Rights Data Analysis Group (and letter signatory), fears that “in order to flag even a small proportion of future terrorists, this tool will likely flag a huge number of people who would never go on to be terrorists,” and that “these ‘false positives’ will be real people who would never have gone on to commit criminal acts but will suffer the consequences of being flagged just the same.”
The Data Scientist Helping to Create Ethical Robots
Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs.
What’s the relationship between statistics and AI and machine learning?
AI seems to be a sort of catchall for predictive modeling and computer modeling. There was this great tweet that said something like, “It’s AI when you’re trying to raise money, ML when you’re trying to hire developers, and statistics when you’re actually doing it.” I thought that was pretty accurate.
Megan Price: Life-Long ‘Math Nerd’ Finds Career in Social Justice
“I was always a math nerd. My mother has a Polaroid of me in the fourth grade with my science fair project … . It was the history of mathematics. In college, I was a math major for a year and then switched to statistics.
I always wanted to work in social justice. I was raised by hippies, went to protests when I was young. I always felt I had an obligation to make the world a little bit better.”
Counting The Dead: How Statistics Can Find Unreported Killings
Ball analyzed the data reporters had collected from a variety of sources – including on-the-ground interviews, police records, and human rights groups – and used a statistical technique called multiple systems estimation to roughly calculate the number of unreported deaths in three areas of the capital city, Manila.
The team discovered that the number of drug-related killings was much higher than police had reported. The journalists, who published their findings last month in The Atlantic, documented 2,320 drug-linked killings over an 18-month period, approximately 1,400 more than the official number. Ball’s statistical analysis, which estimated the number of killings the reporters hadn’t heard about, found that close to 3,000 people could have been killed – more than three times the police figure.
Ball said there are both moral and technical reasons for making sure everyone who has been killed in mass violence is counted.
“The moral reason is because everyone who has been murdered should be remembered,” he said. “A terrible thing happened to them and we have an obligation as a society to justice and to dignity to remember them.”
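The multiple systems estimation Ball used works by comparing overlapping lists of documented victims: when two independently compiled lists rarely record the same death, a large share of killings is likely missing from both. Below is a minimal sketch of the simplest two-list case, the Lincoln-Petersen estimator, written in Python; the counts are invented for illustration and are not the Manila figures, and HRDAG’s real analyses combine more than two sources with more elaborate log-linear or Bayesian models.

    # Minimal two-list capture-recapture sketch, the simplest case of
    # multiple systems estimation. All counts below are invented.
    def lincoln_petersen(n_a, n_b, n_both):
        """Estimate the total number of deaths from two overlapping lists.

        n_a    -- deaths documented on list A (e.g., press reports)
        n_b    -- deaths documented on list B (e.g., police records)
        n_both -- deaths that appear on both lists
        """
        if n_both == 0:
            raise ValueError("the lists must overlap for the estimator to be defined")
        return n_a * n_b / n_both

    # Hypothetical example: 800 deaths on one list, 600 on another,
    # 300 matched on both lists.
    total = lincoln_petersen(800, 600, 300)    # about 1,600
    documented = 800 + 600 - 300               # 1,100 unique deaths on record
    print(f"estimated total: {total:.0f}, never documented: {total - documented:.0f}")

The intuition is the same one the reporters relied on: the overlap between sources indicates how complete the documentation is, and therefore how many deaths no source ever recorded.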
The number of murdered social leaders is higher: Dejusticia
Contrary to what one might think, official data on murdered social leaders do not necessarily reflect reality, and victimization in the regions hit by this scourge could be much higher, according to the most recent report from the Centro de Estudios de Justicia, Derecho y Sociedad (Dejusticia) in collaboration with the Human Rights Data Analysis Group.
Machine learning is being used to uncover the mass graves of Mexico’s missing
“Patrick Ball, HRDAG’s Director of Research and the statistician behind the code, explained that the Random Forest classifier was able to predict with 100% accuracy which counties would go on to have mass graves found in them in 2014 by running the model against data from 2013. The model also identified counties where no hidden mass graves have yet been found but which show a high likelihood of containing them. This predictive aspect of the model is the part that holds the most potential for future research.”
Hunting for Mexico’s mass graves with machine learning
“The model uses obvious predictor variables, Ball says, such as whether or not a drug lab has been busted in that county, or if the county borders the United States, or the ocean, but also includes less-obvious predictor variables such as the percentage of the county that is mountainous, the presence of highways, and the academic results of primary and secondary school students in the county.”
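As a rough illustration of how a county-level classifier along these lines could be set up, the sketch below trains scikit-learn’s RandomForestClassifier on toy data. The feature columns mirror the predictor variables named in the excerpt (drug lab seizures, borders, mountainous terrain, highways, school results), but every value, label, and model setting here is a placeholder, not HRDAG’s code or data.

    # Toy county-level classifier sketch; placeholder data, not HRDAG's.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n_counties = 500

    # One row per county. Columns: drug lab busted (0/1), borders the US
    # (0/1), borders the ocean (0/1), share of mountainous terrain,
    # highway present (0/1), mean school test score.
    X = np.column_stack([
        rng.integers(0, 2, n_counties),
        rng.integers(0, 2, n_counties),
        rng.integers(0, 2, n_counties),
        rng.random(n_counties),
        rng.integers(0, 2, n_counties),
        rng.normal(500, 50, n_counties),
    ])
    # Label: whether a hidden grave was found in the county that year.
    y = rng.integers(0, 2, n_counties)

    # Fit on one year's counties, then score counties for the next year
    # (the same toy matrix stands in for both here).
    model = RandomForestClassifier(n_estimators=500, random_state=0)
    model.fit(X, y)
    grave_risk = model.predict_proba(X)[:, 1]  # probability a grave will be found
    print(grave_risk[:5])

In the work described above, scores from a model of this kind are what allow researchers to rank counties where hidden graves have not yet been found but are likely to be.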
Unbiased algorithms can still be problematic
“Usually, the thing you’re trying to predict in a lot of these cases is something like rearrest,” Lum said. “So even if we are perfectly able to predict that, we’re still left with the problem that the human or systemic or institutional biases are generating biased arrests. And so, you still have to contextualize even your 100 percent accuracy with: is the data really measuring what you think it’s measuring? Is the data itself generated by a fair process?”
HRDAG Director of Research Patrick Ball, agreeing with Lum, argued that it is perhaps more useful to frame the problem not as bias at the individual level but as bias at the institutional or structural level. If a police department, for example, is convinced it needs to police one neighborhood more heavily than another, whether any individual officer is racist is not the most relevant question, he said.