Hey all, it's Sarah Frier. Vaccines have been widely available in the U.S. for a while. As a result, the digital conversation about them has had time to mature, as have the defenses against vaccine rumors, misinformation and disinformation. Top immunization skeptics have been banned from Facebook Inc. and YouTube for months, and millions of pieces of content that could lead users to make bad health decisions have been removed. But in most of the world, people are still just beginning to learn about the shots and are waiting to become eligible for them. They're having a very different social media experience.

The same content that has already gone viral and been banned in the U.S. is now being adapted to non-English-speaking markets, getting translated and spread in other languages by regional news sites and sometimes escaping the tech giants' scrutiny. That's either because their artificial intelligence systems are still learning how to catch the problem in those languages, or because their fact-checking networks simply aren't as strong outside the U.S. My colleague Daniel Zuidijk and I laid out the problem in Bloomberg Businessweek on Thursday.

It makes sense: Facebook, Instagram and Google's YouTube are all based in California, were built first in English and have employees who respond to criticism in the English-speaking media. Problems are often solved in the U.S. before the companies figure out what to do about the rest of the world, where most of their users reside. For instance, Facebook's machine learning algorithms are now trained to find problems in more than a dozen languages; its products are offered in more than 100.

The problem isn't new. Facebook was criticized in 2018 for helping give rise to ethnic violence in Myanmar, particularly because it didn't have anyone on staff who understood the danger of the rumors being spread on its platform. To this day, the company won't disclose how many moderators it employs for any given language or region. More recently, the pandemic has presented a unique challenge because it's a global phenomenon, with information traveling across countries and media markets.

Still, the big companies are at least taking a swing at the problem. Getting banned from Facebook or YouTube is so common that it has turned into a rallying cry for anti-vaccination activists, who direct their followers to find them in places with fewer rules. Many have taken to Telegram, the messaging app. And more recently, they've made accounts on Rumble, a video site popular with right-wing media personalities in the U.S., which just got an injection of capital from investors including Peter Thiel, who sits on Facebook's board.

From those networks, they post the same material, which gets picked up by bloggers and influencers in Germany, Mexico, Brazil and beyond. Then those personalities share what they're hearing on Facebook, Instagram, Twitter and YouTube, starting the companies' cycle of discovering and removing harmful content all over again.

—Sarah Frier