Facebook said Monday that it will no longer accept new political ads for the week leading up to the U.S. election. It will also label posts from candidates who claim victory prematurely and will direct users to the official results.
Why it matters: It's the most aggressive effort Facebook has made to date to curb manipulation in the days leading up to the U.S. election.
Details: The company says it will ban new ads — both political and issue ads — for the campaign's final week because in the final days of an election, there may not be enough time to contest misleading claims. Facebook also won't allow the content of existing ads to be edited.
- Political and issue advertisers can continue running ads they began placing before the final week leading up to Election Day, and they can adjust the targeting of those ads, but they cannot alter the ads' content.
- The company will also add an informational label to content that seeks to delegitimize the outcome of the election. For example, a post saying lawful methods of voting may lead to fraud would be labeled. This label will provide facts about the integrity of the election and voting methods.
- To reduce the risk of misinformation going viral, it will limit forwarding on Facebook Messenger, something the company has already started doing with its other messaging service, WhatsApp.
The tech giant will also remove posts containing misinformation about both COVID-19 and voting — the strongest step it's yet taken to curb misinformation on the virus. It will attach a link to authoritative information about COVID-19 to posts that might use COVID-19 to discourage voting.
If any candidate or campaign tries to declare victory before the final results are in, the company says it will add a label to their posts, directing users to official results as reported by Reuters.
- It will do the same to any posts that try to delegitimize the outcome of the election.
The company says it will work closely with election officials to remove misinformation about voting.
- It says it will partner with state election authorities to identify and remove false claims about polling conditions during the campaign's final 72 hours, through voting itself and until election officials have determined a clear winner.
The big picture: Experts fear a tidal wave of misinformation on Facebook and other tech platforms before, during and after the election could erode the public's trust in, and participation in, the democratic process.
- Efforts to manipulate the 2016 election by foreign and domestic actors went largely unnoticed four years ago. Now, tech companies and national security experts are trying desperately to get ahead of potential threats to election integrity.
What's next: Facebook says it will put authoritative information from its Voting Information Center at the top of Facebook and Instagram "starting soon" through Election Day.
- This will include video tutorials on how to vote by mail and information on deadlines for registering and voting in your state.
- Messages in the Voting Information Center will prepare visitors for the possibility that official results may take time to be tallied and might not be available election night.