09 December 2020
Data: NewsWhip; Chart: Andrew Witherspoon/Axios
Facebook and other big online platforms insist they're removing more and more misinformation. But they can't say whether they're actually stemming the tide of lies, and neither can we, because the deluge turns out to be impossible to define or measure.
Why it matters: The tech companies mostly won't share data that would let researchers better track the scale, spread and impact of misinformation. So the riddle remains unsolved, and the platforms can't be held accountable.
Where it stands: Quantifying the volume of misinformation online requires three things: a clear definition of the stuff you're measuring; a way to divide it into countable units; and the ability to see into vast pools of online data to find and enumerate it.
The catch: People disagree over which pieces of information actually count as misinformation.
- The other catch: Big platforms are reluctant to share data, because releasing it could help competitors and might violate users' privacy rights.
We're not entirely in the dark, and have some simple but useful tools. Among them:
- Google Trends measures the total volume of searches for a given term and can capture the tipping point when false narratives break out into the mainstream (a rough query sketch follows this list).
- NewsWhip gauges the attention particular topics are receiving by measuring the social media interactions — Facebook and Twitter likes and shares, for example — that news stories and other links about them garner. (It's how we built our chart above.)
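For a sense of what that kind of query looks like in practice, here's a minimal sketch using the third-party pytrends package, an unofficial Google Trends client; the search term and timeframe are placeholders chosen for illustration, not anything measured for this story:

```python
# Minimal sketch, assuming the third-party "pytrends" package is installed.
# The search term and timeframe below are illustrative placeholders.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)

# Ask Google Trends for relative search interest in one phrase over the
# past 12 months (values are scaled 0-100, not absolute search counts).
pytrends.build_payload(["5g coronavirus"], timeframe="today 12-m")
interest = pytrends.interest_over_time()  # pandas DataFrame indexed by week

# A sudden jump in this series is the kind of "tipping point" described
# above: the moment a fringe narrative breaks into mainstream search.
print(interest.tail())
```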
Yes, but: Such methods provide just a small part of the picture — and nothing about whether the people clicking on misinformation are actually buying it.
- "Measurement of reach alone never tells you the story of where it is that [misinformation is] having influence," said Camille François, the chief information officer of social media analysis firm Graphika, which has proposed its own scale for measuring misinformation-incident severity.
Between the lines: So-called "super spreaders" of misinformation — such as President Trump and his family, who amplify large volumes of often obscure misinformation to huge followings — play a big role here. But they're far from the only factor.
The big picture: Experts Axios talked with point to several big problems with existing methods of quantifying misinformation.
1. The numbers that are available are incomplete and potentially misleading.
- Twitter and Facebook have offered snapshots of how much material they've taken down around certain topics, but not the total volume of material they're reviewing.
- Observers aren't sold on relying on the platforms' own assessments. "They make mistakes in both directions," said Ian Vandewalker, who focuses on influence and disinformation campaigns as senior counsel for the Democracy Program at the Brennan Center for Justice.
- "They're using algorithms, so they miss a lot of things, and they also have a lot of false positives."
2. The public internet is only one stream in the broader misinformation deluge.
- False claims and conspiracy theories increasingly spread in private Facebook groups, on private chat servers on platforms like Discord, and in private texts and messaging threads. They also surface in partisan media outlets, elected officials' public statements and everyday real-world conversation.
- "It is in all types of spaces," said Amy Mitchell, director of journalism research at Pew Research Center. "This is not just a social media phenomenon."
3. "Misinformation" can be a subjective category.
- Something like "5G towers spread COVID-19" is an easily adjudicated false claim. But most misinformation appears in shades of gray, coming as a misleading gloss on events or statistics with some basis in reality.
- Claims that downplay the threat posed by the coronavirus, for instance, often leave out the fact that it can be more deadly to people with pre-existing conditions — and that many people who have survived the virus are suffering long-term health complications.
- And the language of misinformation is often innuendo and obfuscation — vague allusions to conspiracies and malfeasance rather than bald-faced lies.
What's next: To measure misinformation, we may need to think less about counting grains of sand and focus more on following currents. That would mean a greater effort to classify and track the communities discussing topics linked to misinformation (a toy sketch of that idea follows the list below).
- Yonder, an artificial intelligence startup that monitors mis- and disinformation, is one of a number of companies and research groups doing just that.
- Understanding who's driving discussion of a topic can serve as a shortcut for individuals to judge its merits without relying on platform enforcement or transparency.
- "We need to empower the users who consume this information with more information about the agenda behind the groups that are promoting it," Yonder CEO Jonathon Morgan told Axios.
Transcripts show George Floyd told police "I can't breathe" over 20 times
Newly released transcripts of bodycam footage from the Minneapolis Police Department show that George Floyd told officers he could not breathe more than 20 times in the moments leading up to his death.
Why it matters: Floyd's killing sparked a national wave of Black Lives Matter protests and an ongoing reckoning over systemic racism in the United States. The transcripts "offer one of the most thorough and dramatic accounts" of the moments before Floyd's death, The New York Times writes.
The state of play: The transcripts were released as former officer Thomas Lane asks a court to throw out the charges that he aided in Floyd's death, per the Times. He is one of four officers who have been charged.
- The filings also include a 60-page transcript of an interview with Lane. He said he "felt maybe that something was going on" when asked if he believed that Floyd was having a medical emergency at the time.
What the transcripts say:
- Floyd told the officers he was claustrophobic as they tried to get him into the squad car.
- The transcripts also show Floyd saying, "Momma, I love you. Tell my kids I love them. I'm dead."
- Former officer Derek Chauvin, who had his knee on Floyd's neck for over eight minutes, told Floyd, "Then stop talking, stop yelling, it takes a heck of a lot of oxygen to talk."
Read the transcripts via DocumentCloud.