Hey all, it's Sarah Frier. I remember the first time the U.S. Senate grilled tech executives about the Russian misinformation that spread on their sites ahead of the 2016 election. The hearing was on Halloween a year later, and senators were especially scared about Russia's lies around how to vote. They brought images of social media posts telling people to vote via Twitter, or via text message—neither of which works in U.S. elections. Whether or not it succeeded in tricking people, the lawmakers said, the attempt at suppression was a threat to the democratic process.

By the 2018 midterms, Facebook had a solution. Voting misinformation could be reported and removed or fact-checked. The company, which had become a poster child for election interference attempts, touted the move in the press at the time. The signal: Facebook is taking its responsibility seriously.

Now, the broader conversation about truth on social media has shifted to Twitter Inc., which started planning misinformation labeling more recently. But instead of Russia spreading inaccurate information on voting, this time, it's President Trump. This week, Trump said mail-in ballots would be "substantially fraudulent." Twitter, for the first time, appended a subtle fact-check note on two of his tweets. The move caused a firestorm of rage from Trump and his supporters, directly attacking the company and one of its employees. On Wednesday the White House said Trump would soon sign an executive order on social media.

Largely missing from the hubbub, though, is Facebook. Trump said the same thing about mail-in ballots in a Facebook post, and the company allowed it to go viral without fact-checking notation. In an interview on Fox, Mark Zuckerberg said "I just believe strongly that Facebook shouldn't be the arbiter of truth of everything that people say online." And in a statement, the company said it focuses on posts that would "interfere with the vote." Facebook has largely avoided criticism so far.
Those same senators who pushed tech companies to answer for 2016 voter misinformation have yet to call on the company to take fresh action. Facebook has been clear that it won't fact-check politicians. But there are certain kinds of speech—such as harmful coronavirus misinformation, or voter suppression—that the company has said it will always take down. And while it's debatable whether Trump's latest posts were voter suppression, it's possible that future posts from the president could walk even closer to the company's line in the sand.

As we barrel closer to the November election, Facebook will no doubt have more opportunities to decide what to do when the types of posts it usually bans come from the same people whom it usually allows to lie. Facebook is already in the U.S. government's crosshairs as the Federal Trade Commission and the Department of Justice investigate possible anticompetitive behavior. Taking action on a Trump post would be riskier for Facebook than it would be for Twitter—Trump's favorite platform, which doesn't appear to be under investigation at the moment. But even Twitter is walking a delicate line. And after this week's blowback, Facebook executives might go to great lengths not to provoke Trump, leaving its users to determine the facts for themselves. —Sarah Frier