Information Warfare Is Moving Beyond Nation States
Mis- and disinformation ecosystems extend beyond the usual bad state actors.
The United States is dealing with numerous crises simultaneously, including the pandemic, the effects of climate change, supply chain disruptions and inflation, and polarizing, paralyzing partisanship. All of them are being turbocharged by misinformation and disinformation spread at lightning speed through social media and the internet.
A panel of experts convened Jan. 20 by the Aspen Digital program at the Aspen Institute offered a dismal, but not hopeless, portrait of the reasons behind the explosion of bad information and ways to combat the phenomenon.
“We tend to think of this as extremely novel, but it’s not, [it’s] something perennially with us,” said Alicia Wanless, director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace. “We don’t have a good way to look at it holistically,” partly because efforts to address misinformation and disinformation focus on the bad actors promoting them rather than their ecosystems.
Wanless said there are three broad categories of people responsible for this state of affairs: those seeking power, who may be using bad information to mobilize their supporters; proselytizers seeking to gain converts to their beliefs; and those doing it for profit.
“They may not believe what they’re espousing, they’re in it to make money,” she said.
“We misunderstand what the goals of a misinformation campaign are,” said Garrett Graff, a former journalist who is now director of cyber initiatives at Aspen Digital. “It’s to create permission structures [for people] to believe what they’re already inclined to believe … Disinformation is a symptom, not the disease,” applied to exploit existing social wedges such as racism.
“Ten years ago, when we started, we had very distinct topic areas,” said Yasmin Green, director of R&D for Jigsaw, a unit within Google that explores threats to open societies. “Over time misinformation and disinformation [have spread] over all of them.”
Green said there has been particular growth in the spread of conspiracy theories. People everywhere “are grappling with world views about not just what’s happening, but why it’s happening,” she said. “Whether it’s for political motivation or profit or to mobilize, [it’s] a human problem, not a technology problem.”
Former Congressman Will Hurd, who was a clandestine officer in the CIA before running for office, said the problem has two parts – foreign and domestic – and the tools and institutions to address it are different in each arena.
“We have a foreign policy apparatus,” Hurd said. “We know who [the foreign violent extremists] are, we know what the messages are. The Russians have been perfecting the use of misinformation and disinformation for decades.”
Domestically, taking steps to fight the problem is much harder. “There’s erosion of trust in all entities – local, state, federal government, the press, the scientific community. Nobody knows where to go and they latch onto others with similar ideas,” Hurd said.
Compounding the domestic challenge, “we always want to talk about the edges in political life,” he said, referring to left- and right-wing extremists. He pointed out that the existing primary system rewards those on the fringes. “Ninety-two percent of Republican primaries decided who the elected official would be, [many of whom] were the most extreme … That’s not the majority of the country.”
Wanless sees three major shifts further driving the problem: information travels farther and faster than ever before, anybody can get involved, and data aggregation gives marketers both the incentive and the opportunity to be ever more provocative in pursuit of the engagement they seek.
Green offered a little bit of optimism based on her own experiences.
“The challenge of the Islamic State [recruiting followers online] felt so daunting and formidable,” she said. “Six years on, I look back and think, ‘Wow, that was a solvable problem.’”
What made the difference was the ability to show the falsity of their claims of an Islamist utopia.
“There were long queues of people to get bread, hospitals that weren’t working … we targeted advertising to people sympathetic [to the Islamists] that showed” those images, she added.
Fact-checkers are valuable, but “that model can’t be scaled to the size of the problem,” Green said. Instead, the organizations focused on fighting misinformation and disinformation should be exploring how to “pre-bunk,” rather than debunk, false claims, she suggested.
“The prescription is to identify the misinformation [narratives] that have legs, that endure, and get out ahead of them,” Green said.
“I would look at how we [can] have more competitive seats, rather than less,” Hurd said. “It takes away the motivation for some of those edges … There’s not a silver bullet. It’s improving education, it’s improving infrastructure for areas that don’t have” access to the internet and reliable sources of information.
Wanless said that rather than looking at individual dis- and misinformation campaigns, researchers should be looking deeper, at the precursor situations that lay the groundwork for those campaigns.
She noted that the younger generations, who have grown up with internet ubiquity, do not understand how they can be targeted.
“I think there needs to be a better education campaign about what life in the internet age … really is,” Wanless said.