Most Americans unaware of foreign intel operations’ scope on social media, State Dept. official says
U.S. Ambassador at Large for Cyberspace and Digital Policy Nate Fick’s assessment comes five months before a high-stakes U.S. presidential election.
The State Department’s top digital and cyber policy official says that most Americans are seemingly unaware of how much of their day-to-day social media content is actually made up of veiled foreign intelligence campaigns.
The stark declaration from U.S. Ambassador at Large for Cyberspace and Digital Policy Nate Fick comes amid a polarized national debate over U.S. foreign policy and an upcoming presidential election where generative AI tools — which have become a top concern for spy officials — could be used to attempt to turn the tide of election outcomes.
“One thing that strikes me is — after a couple of years in this role — I don’t think most American citizens really viscerally understand how much of the content they see on social platforms is actually a foreign intelligence operation,” Fick said on stage at a Washington Post Live event.
“I just don’t think we viscerally get how much of what we see is bot-generated or foreign intelligence service-generated,” he said, referring to bots deployed on social networks that seek to manipulate public opinion or spread disinformation.
Asked about social media influence escalation connected to Israel’s war in Gaza, he said foreign governments have been working to exacerbate differences “unequivocally across multiple platforms and multiple vectors” without naming specific nations. The New York Times reported Tuesday that Israel financed and conducted an influence campaign last year aimed at U.S. lawmakers and the American public, promoting pro-Israel messaging via fake social media accounts and news sites.
Private sector research has already identified multiple foreign influence operations connected to American election processes, including Chinese operatives deploying fake social media personas that have attempted to probe U.S. domestic issues and learn what political themes divide voters.
“In my little purview, something that we try to make very clear to Russia, to China [and] to others on a consistent basis is that we view any kind of interference in our democratic process as dangerous, as escalatory and as unacceptable,” he later said.
Officials and researchers fear consumer-facing AI tools or similar offerings available on the dark web will supercharge hackers’ attempts to breach election infrastructure or craft realistic-sounding campaigns to sway voters away from the polls, propelling a federal push to stay ahead of such cyber and disinformation threats as November approaches.
TikTok last week said it cleared a slate of 15 foreign influence campaigns on its platform that were active in the beginning months of this year, including an entity affiliated with China. The popular social media app faces its own scrutiny as a possible covert influence operation with alleged ties to China’s central government. It faces a possible U.S. sale or ban next year under a law passed in April, though the company has legally challenged the move.
OpenAI also recently said it disrupted covert influence operations that used its AI models for various tasks, including generating short comments and longer articles in multiple languages, creating fake names and bios for social media accounts, conducting open-source research, debugging simple code, and translating and proofreading texts.