Hey all, it's Kurt. America got a front-row seat this week to the kind of content challenges Facebook Inc. and Twitter Inc. are dealing with on a near-daily basis. President Donald Trump, the country's loudest and most controversial social media user, said Tuesday during a presidential debate that mail-in ballots would lead to widespread fraud, a claim past elections don't support, setting the stage for a contested election. He claimed that a Covid-19 vaccine was close at hand and that young people aren't very vulnerable to the disease—statements at odds with health experts' advisories. And when Trump was asked explicitly to denounce white supremacy, he wouldn't. "Proud Boys, stand back and stand by," he said, seeming to address the far-right hate group directly.

The debate provided a real-life example of the types of content and rhetoric social media platforms like Facebook and Twitter see from the president regularly. You've seen iterations of Trump's claims before. Similar ideas have been circulating online for months, from both Trump and his supporters. Facebook and Twitter seem to have as much trouble deciding what to do about them as Tuesday's debate moderator, Chris Wallace of Fox News. Wallace struggled to maintain control as Trump in particular consistently interrupted the debate. "I never dreamt that it would go off the tracks the way it did," he told the New York Times on Wednesday.

But the debate performance wasn't just a one-night problem. On stage, Trump normalized a handful of misinformed statements, and seemed to legitimize a known hate group to millions of people watching on TV. Now, Facebook and Twitter are tasked with policing that rhetoric online, a growing challenge as Trump continues funneling those ideas into the mainstream.

Facebook in particular has struggled to enforce its policies regarding the president's speech, despite a number of posts that appear to violate the company's rules. Chief Executive Officer Mark Zuckerberg has argued that voters should hear from elected officials for themselves, and that it's not Facebook's role to filter those messages, no matter how inflammatory.

On one hand, it feels like Facebook's content decisions mean less than ever. The social network can't control what elected officials say on live TV. Even if Facebook flagged every incendiary Trump post on its service, voters would likely see it one way or another. But you also have to wonder how much of Trump's approach on Tuesday was shaped by Facebook's willingness to let similar arguments go virtually unchecked for months. Perhaps, if those arguments weren't part of the everyday social media discussion, they wouldn't be part of the nationally televised debate, either.

Wallace, for one, wishes he'd done things differently. "For me, but much more importantly, I'm disappointed for the country, because it could have been a much more useful evening than it turned out to be," he said later. The feedback Wallace got was the very same feedback Facebook and Twitter often receive: Simply turn off the president's microphone. As misinformation seeps further into the American psyche, it may not be that easy. —Kurt Wagner