Hey everyone, it's Sarah Frier. This week a Joe Biden campaign spokesman said Facebook was "shredding the fabric of our democracy." It could have been much worse.

Facebook Inc., along with the other major social networks, enacted sweeping changes to its content moderation policies in the run-up to the U.S. election. The company temporarily down-ranked content most likely to be false, meaning it made users less likely to stumble on it in their news feeds. It automatically flagged viral stories for fact-checkers, halted political ads, suspended many hashtags and limited the virality of live videos about politics.

On Wednesday, Facebook said that the period of possible civil unrest that necessitated all these restrictions is not over yet. The company will continue to limit political ads—including for the key Georgia runoff elections that could determine control of the Senate—for another month. That's also the timeline for the presidential election results to be certified by most states.

"We're temporarily extending a number of measures we put in place to protect the election process," said Rob Leathern, a director of product management at Facebook, in a tweet.

Again, Facebook described these measures as temporary. But by most metrics, they worked pretty well. There has been little evidence so far of any successful campaign by foreign agents to undermine the election and few coordinated attempts to intimidate voters. It was an improvement over 2016, when Russian actors enjoyed wide reach on Facebook in their attempts to influence or deter voters.

So it's worth asking: If turning off some of your product's features results in less societal chaos, why would you ever turn them back on? Why does it only make sense to calm the masses when the masses have come to a boiling point—perhaps in part because of those very features?

Over and over, we've learned that Facebook's groups recommendation algorithm has the potential to radicalize and recruit conspiracy theorists, and that the content that goes viral fastest is often the most scandalous and misleading. We've seen world leaders use the site to sidestep the media with viral propaganda, as President Donald Trump did after President-elect Biden's victory was called.

The new policies can't claim to correct for all of this, but experts say they have made a real difference in recent weeks. A Facebook spokesperson said "no changes are imminent" to the plan to retire the extra safeguards. And it's impossible to know how much worse the political climate would be right now without them. But even if the current national mood gets a lot calmer, it's not hard to imagine that flipping the switch back at Facebook could eventually invite a new kind of chaos. —Sarah Frier