Yoel Roth spends a lot of time thinking about what could go wrong on Twitter. It's his job, as the social media company's head of site integrity.
"Having a vivid imagination is key," he told NPR. "None of the threats are off-limits."
When it comes to the 2020 presidential election, there are lots of ways bad actors could abuse the platform. Roth mentioned a few, from a high-profile hack like Twitter saw in July to attempts by Russia and other countries to use the platform to spread misinformation.
Twitter has gamed out how it would respond to those attacks, as well as many others, as part of the threat modeling exercises it does both on its own and in concert with Facebook, other tech companies and government agencies.
"We really undertook a process to try and predict what the worst-case scenarios were based on what we had seen previously in 2016, 2018 and in elections around the world, as well as some of the things that we thought were likely to happen in the United States this time around," Roth said.
Lately, those exercises have involved another scary scenario: the possibility that, with so many people voting by mail in this election, the results could be delayed.
That uncertainty could create the kind of information vacuum in which conspiracy theories and attempts to undercut democracy would thrive — especially on social media.
"We are on high alert before the election and after the election," Sheryl Sandberg, Facebook's chief operating officer, told NPR's All Things Considered this week. "We are worried about misinformation. We are worried about people claiming election results [prematurely]."
The platforms are already dealing with a lot of falsehoods and misleading posts about the election. Many of those messages are being spread by President Trump. He has repeatedly questioned the legitimacy of the voting process, as he did during this week's debate.
Delayed results would offer even more opportunity to cast doubt on the outcome and to amplify that doubt using social media.
Of course, this would not be the first time Americans have had a long wait to find out who won a presidential race. In 2000, it took 36 days for a winner to emerge in the battle between George W. Bush and Al Gore.
But disinformation expert Clint Watts, who has long studied how Russia interferes in U.S. politics, said the threat in 2020 is different.
"There were some angry lawyers in Bush versus Gore, but it is pretty tame compared to today," he said. And that's not just because of the misleading rhetoric.
"People couldn't be mobilized on social media in this way," he added.
Facebook and Twitter have created new rules against posts that undermine election results. The platforms now say they will label or remove posts that say mail-in ballots are fraudulent, for example, and will not allow anyone to advocate violence to disrupt the peaceful transfer of power. Just in the last few days, Facebook has said it will reject ads that claim premature victory or that try to "delegitimize" the outcome. Google, owner of YouTube, which has become a big destination for political content, said it won't allow any political advertising at all after polls close.
Twitter and Facebook also said they are using both artificial intelligence and human review to stop harmful content from spreading widely.
But stopping messages from going viral runs counter to the way Facebook and Twitter work. The platforms are designed to put posts that attract a lot of attention — often because they are emotional or sensational — in front of lots of people.
With so much at stake, experts said it is not enough for the platforms to announce new rules.
"The major gap now is not in the design of the policies on the platforms. The major gap is in enforcement of those policies," said Chloe Colliver, who studies online extremism at the Institute for Strategic Dialogue in London.
Critics said Facebook's track record, in particular, is not great.
Just this week, former Vice President Joe Biden's campaign called Facebook "the nation's foremost propagator of disinformation about the voting process" because of its decision to label, rather than remove, Trump's posts attacking mail-in ballots.
Facebook insisted it applies its policies fairly. "We are taking a lot of things down," Sandberg said. "We obviously have to let candidates speak."
So to the big question: Will the social networks hold the line in the days after the election? The answer is far from certain.
Editor's note: Facebook and Google are among NPR's financial supporters.