YouTube announced Thursday that it is expanding its hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories, like QAnon, that have been used to justify real-world violence.
Why it matters: YouTube is the latest tech giant to crack down on QAnon content, which has seen record online interest in 2020.
- Facebook banned QAnon content across all of its platforms earlier this month.
- Twitter announced a broad crackdown on QAnon content in July.
Catch up quick: QAnon is a sprawling, far-right conspiracy theory that falsely alleges a secret cabal of sex traffickers and pedophiles is waging a war against President Trump from inside the government.
- Trump has previously praised the movement, which the FBI has deemed a potential domestic terrorist threat, saying that he understands its supporters "like me very much" and that they "love America."
The state of play: YouTube said that the policy applies to content that threatens or harasses someone by suggesting they are complicit in harmful conspiracy theories, like QAnon or Pizzagate.
- The tech giant added that news coverage on these issues or content discussing them that doesn't target individuals or protected groups — veterans, people with disabilities, etc. — may stay up.
Between the lines: YouTube also said that its efforts to refine its policies over the past two years have helped to curb QAnon-linked content.
- It said the steps it introduced nearly two years ago to limit the reach of harmful misinformation have resulted in an 80% drop in views of QAnon content via its search and discovery systems.
- The company noted it has removed tens of thousands of QAnon videos and hundreds of channels to date, focusing on those that explicitly threaten violence or "deny the existence of major violent events."
The big picture: Tech platforms have been caught flat-footed trying to manage the spread of conspiracy-linked content, because it often does not violate their existing hate speech or harassment policies.
- In the wake of calls for violence around the election, Big Tech is trying to get ahead of the broader harm that conspiracy content can inflict on society.