China is stepping up efforts to influence people in other countries on social media, becoming the third most common source of foreign influence operations, behind Russia and Iran, according to Meta, the parent company of Facebook and Instagram.
Meta has taken down five Chinese networks of fake accounts in 2023, the most of any country this year, the company said in a new report published on Thursday. That’s a significant increase from 2019, when Meta first removed a campaign based in China, although the country’s efforts over the years haven’t gained much traction.
“This is the most notable change in the threat landscape compared with 2020,” said Ben Nimmo, Meta’s global threat intelligence lead.
The targets of the Chinese operations that Meta has disrupted include people in sub-Saharan Africa, Central Asia, Europe and the United States. The campaigns vary widely in how they work, but the focus tends to be on promoting Chinese interests, from defending Beijing’s human rights record to attacking government critics, Nimmo said.
“There’s a very kind of global mandate there. And they are using many different tactics. So we’ve seen small operations that try and build personas. We’ve seen larger operations using large, clunky, sort of spammy networks,” he said. “The common denominator, other than origin in China, is really that they’re all struggling to get any kind of authentic audience.”
Latest Chinese operations targeted U.S., Tibet and India
Most recently, Meta took down two China-based operations in the third quarter of this year. One was a network of around 4,800 Facebook accounts impersonating Americans and posting about domestic politics and U.S.-China relations.
Using fake names and profile pictures copied from elsewhere online, the accounts — some of which also operated similar accounts on X, formerly known as Twitter — copied and pasted posts on X from American politicians. The copying spanned political parties, including Democrats Rep. Nancy Pelosi of California, Sen. Mark Kelly of Arizona and Michigan Gov. Gretchen Whitmer, as well as Republicans Rep. Jim Jordan of Ohio, Sen. Marsha Blackburn of Tennessee and the presidential campaign war room of Florida Gov. Ron DeSantis.
“It’s unclear whether this approach was designed to amplify partisan tensions, build audiences among these politicians’ supporters, or to make fake accounts sharing authentic content appear more genuine,” Meta said in its report.
The posts were obviously copied, with some including giveaways like “RT,” indicating a retweet, and the @ symbol used before an X username. Some of the accounts reshared posts from X owner Elon Musk, as well as links to news articles and Facebook posts from real people. Meta said it removed the accounts before they were able to get engagement from real users.
The other network that Meta took down was smaller but more sophisticated. It consisted of 13 Facebook accounts and seven groups mainly targeting Tibet and India. The accounts posed as journalists, lawyers and human rights activists. Some also operated accounts using the same names and profile pictures on X.
They posted about regional news, sports and culture, criticized the Dalai Lama and accused the Indian government of corruption while praising India’s army, athletes and scientific achievement. A handful posed as Americans and shared links to U.S. news outlets. Meta said about 1,400 accounts joined one of the groups before the groups were taken down.
Nimmo said the contrast in the two campaigns shows the range of tactics that China-based networks use. “There isn’t a single playbook which would apply to Chinese [influence operations],” he said.
Meta didn’t attribute either network to a specific actor in China. Previously, the company has attributed other disrupted operations to the Chinese government, IT firms and Chinese law enforcement.
State actors expected to target elections globally in 2024
With a slew of elections on tap in 2024, including in the U.S., Taiwan, India and the European Union, Chinese operations may “pivot” to target discussions of relations with China in those places, Nimmo said. That will add to expected operations by Russia and Iran.
“Because we’ve already seen threat actors trying to hijack partisan narratives, we hope that people will try to be deliberate when engaging with political content across the internet,” he said. “For political groups, it’s important to be aware that heightened partisan tensions can play into the hands of foreign threat actors.”
Russia, which Meta says remains the most prolific source of coordinated influence operations, has mainly been focused on undermining international support for Ukraine since its February 2022 invasion of that country. But recently, a Russian operation known as Doppelganger that impersonates news outlets has launched a new set of websites focused on American and European politics and elections, using names including Election Watch, Truthgate and 50 States of Lies.
“Much of their content appears to have been copy-pasted from mainstream U.S. news outlets and altered to question U.S. democracy,” Nimmo said. “In addition, soon after the Hamas terrorist attack in Israel, we saw these websites begin portraying the war as proof of American decline. At least one website claimed that Ukraine supplied Hamas with weapons. Other websites in the cluster focused on politics and migration in France and Germany.”
Meta said it is blocking those websites from its platforms and sharing the full list of Doppelganger-linked domains with other companies.
After Russian efforts to influence the 2016 U.S. presidential election brought attention to the risks of foreign interference online, Meta and other tech companies came together with civil society groups, researchers and federal agencies to harden online platforms against such campaigns by sharing information, including tips about threats. But those efforts have recently come under legal and political pressure from Republicans who claim they amount to illegal censorship, and this coordination has begun to break down.
In its report, Meta said the U.S. government has “paused” sharing information about foreign election interference since July. That’s when a federal judge issued an injunction barring federal agencies from communicating with social media platforms about most content. The injunction has been put on hold while the Supreme Court hears the case, but it has already had a widespread chilling effect.
Nathaniel Gleicher, Meta’s head of security policy, said the company continues to share information about threats it uncovers with the government and other partners.