As the 2024 U.S. presidential election looms, U.S. intelligence officials have raised alarms about Russia's increasingly sophisticated efforts to influence American voters through social media. Recent criminal charges and intelligence briefings reveal that Russia is covertly enlisting American social media influencers, some unknowingly, to disseminate narratives that align with Kremlin interests, marking a new chapter in foreign interference tactics.
The Justice Department recently indicted two employees of the Russian state media outlet RT (formerly Russia Today), accusing them of funneling approximately $10 million to a Tennessee-based conservative media company known as Tenet Media. The funds were allegedly used to pay American social media stars to create and share content, some of which included anti-Ukraine messaging. "What we see them doing is relying on witting and unwitting Americans to seed, promote, and add credibility to narratives that serve these foreign actors' interests," a senior U.S. intelligence official said.
The indictment details how Tenet Media, which bills itself as a home for "fearless voices," managed a YouTube channel and various social media profiles. The company's founders, Lauren Chen and Liam Donovan, are accused of knowingly accepting Russian money while the influencers they paid were unaware of the source. The scheme mirrors Cold War-era propaganda strategies, but with a modern twist: using influencers instead of journalists to amplify the Kremlin's message. Renee DiResta, a digital disinformation analyst, noted, "This is sort of a digital update to that."
This incident is part of a broader pattern of foreign influence operations targeting the 2024 election. U.S. authorities have identified at least four significant operations by foreign actors, including Russia, Iran, and China. Each country has distinct goals: Russia aims to support former President Donald Trump, Iran seeks to undermine Trump's candidacy, and China is focused on undermining Americans' confidence in democracy rather than backing a specific candidate.
One of Russia's other operations, codenamed Doppelganger, involves creating fake news websites that mimic legitimate Western outlets to spread disinformation. These sites have been used to promote false narratives about U.S. political candidates and the war in Ukraine. The Justice Department recently seized 32 domains associated with the Doppelganger operation, highlighting the ongoing threat posed by such tactics.
Despite the sophisticated nature of these operations, their effectiveness remains questionable. Emerson Brooking, director of strategy at the Atlantic Council's Digital Forensic Research Lab, cautioned against overstating their impact. "Even when the numbers sound big, as a share of election-related activity and discourse in the United States, it is a drop in the ocean," Brooking said. He emphasized that while these efforts are concerning, there is little evidence to suggest they have significantly swayed voter behavior.
The challenges of combating foreign influence in the digital age are compounded by the evolution of these tactics. Countries like China have even turned to artificial intelligence (AI) to generate fake personas and content. A recent report by social media analytics firm Graphika uncovered new accounts tied to China's "Spamouflage" campaign, which has been active for years and continues to evolve.
In response to these threats, the U.S. intelligence community has ramped up its efforts to counter foreign influence. The Office of the Director of National Intelligence's (ODNI) Foreign Malign Influence Center plays a key role in identifying and mitigating these operations. However, experts like Olga Belogolova, director of the Emerging Technologies Initiative at Johns Hopkins School of Advanced International Studies, stress the difficulty of measuring the true impact of these campaigns. "That is very difficult to measure," she said, referring to the challenge of linking online propaganda to changes in voter behavior.