This week, Meta reported that it had intervened to disrupt two alleged "influence operations," one run from Russia and the other from China.
The Russia-based campaign primarily targeted the UK, France, Italy, and Germany. Beginning in May, a network of more than 60 websites in Europe posing as reputable news outlets published original stories denouncing Ukraine and lobbying against Western sanctions on Russia, according to Meta.
The group promoted its articles, YouTube videos, and memes on platforms including Instagram, Facebook, Twitter, Telegram, and Change.org, Meta added.
According to Meta, the Russian operation was unusual in that it combined tactics of varying sophistication, with the goal of performing a "smash and grab" on the information environment. While the fabricated articles and websites showed time and attention to detail, the social media amplification "took a brute-force approach."
"The spoofed websites and the use of many languages demanded both technical and linguistic investment," Meta said. "The amplification on social media, on the other hand, relied primarily on crude ads and fake accounts."
The Chinese network, which was smaller and far less developed, gained little to no traction. It comprised 92 accounts, pages, and groups on Facebook and Instagram, which together amassed a total of about 280 followers.
This is the first time Meta has removed a Chinese operation targeting the U.S. midterm elections. During the spring and summer of this year, fake accounts posted about U.S. politics, politicians such as President Joe Biden, Sen. Marco Rubio (R-Fla.), and Florida's Republican governor, Ron DeSantis, as well as controversial subjects such as abortion access and gun rights. The accounts first posed as conservative Americans before switching to liberal ones.
Both network disruptions were revealed in a report released on Tuesday detailing Meta's efforts to combat what it refers to as "coordinated inauthentic behavior." These types of activities, according to Meta, amount to "coordinated efforts to manipulate public debate for a strategic goal, in which fake accounts are central to the operation."
While the campaigns were unrelated, the simultaneous takedowns highlight how social media platforms remain prime targets for efforts to change narratives around high-profile events, according to Ben Nimmo, Meta's global threat intelligence head.
Facebook continually looks for and removes accounts it believes have violated its rules against coordinated inauthentic behavior. Such conduct became a flashpoint after the 2016 presidential election, when intelligence agencies found that Russian groups had exploited social media platforms to spread divisive narratives in the U.S.