Hi, this is Kartikay on the cyber team. Every Wednesday, a group of leading cybersecurity experts meets on a Zoom channel to brainstorm ways to maximally disrupt American democracy. Then, they plot out how to dismantle whatever disinformation hydra they've created. This process—gaming out destructive scenarios and then tearing them down—is one of the more colorful ways security professionals have found to try to protect the U.S. election coming up in 34 days.

Since the 2016 presidential race brought the perils of social media disinformation into the mainstream, reporters and researchers have spent countless hours scrolling through the feeds of groups likely to try to impact the election. Despite the known risks, though, the industry has done little more than simply point out these problems, and social media giants have struggled to corral bad actors' ever-evolving operations. The group of experts meeting on Wednesdays was frustrated that the bulk of their profession was simply "admiring the problem," one member said, and wanted to develop tools to fight back against malicious or deceptive online content.

This weekly operation, ranging from six to 20 people, is called the "F--kery Factory." (Their soon-to-be-released logo will include the Fs notated for 'fortissimo' in classical music.) It comprises a group of security experts mostly tethered to the Cognitive Security Collaborative and the Cyber Threat Intelligence League, two prestigious nonprofits. The group's exercises are almost always rooted in real-world events—the Beirut ammonium nitrate explosion, coronavirus-era school re-openings and anti-mask movements, for example. Each exercise pits two teams against each other—a red team that builds and executes the disinformation campaign, and a blue team tasked with identifying and responding to it.
During one session in early August, Sara-Jayne Terp, a disinformation expert and chair of the CogSec Collaborative, said her group had gamed out a hypothetical surge in activity around the hashtag #HugsNotMasks, a social media campaign claiming to support the struggle of moms against social distancing. Terp's next big worry involves a campaign to malign temperature scanners being adopted by schools preparing to reopen in parts of the U.S.: "Something about rays coming out of temperature sensors hurting kids…that's going to be a thing," she said. By plotting out such scenarios, "We give the team permission to be evil every week," Terp said. "And then we treat these campaigns like they're malware."

Once the group has a disinformation campaign drawn up using machine learning and data analysis, it prepares countermoves to beat it back. The "Factory" has published multiple documents, including one called Adversarial Misinformation and Influence Tactics and Techniques, a sort of framework for identifying and addressing disinformation, available to security professionals, novice researchers and everyone in between.

It's rarely surprising when these simulated campaigns end up playing out in the wild, Terp said. But some scenarios have become more common than others. While the group has spent a lot of time focusing on foreign agents, domestic misinformation has recently become more prevalent. Instead of campaigns designed by the Russians to sow division between Trump supporters and Black Lives Matter activists, as happened in 2016, this year Americans are doing that work on their own, as evidenced by the proliferation of QAnon and viral anti-mask content. But Terp allowed that there's still time for foreign interference to pick up before November. "There are just so many sub-narratives at play—all of the protests, violent extremists, everything coronavirus," she said. "There are so many things that could be played out that just aren't.
Maybe it's just a matter of time." —Kartikay Mehrotra