- By Grigor Atanesian
- Global Disinformation Team
Hundreds of Russian and Chinese state propaganda accounts are thriving on Twitter after Elon Musk wiped out the team that fought those networks, the BBC has found.
The unit worked to counter “information operations,” campaigns coordinated by countries like Russia, China and Iran, intended to sway public opinion and disrupt democracy.
But experts and former employees say the majority of those specialists have quit or been fired, leaving the platform vulnerable to outside manipulation. The BBC spoke to several of them. They requested anonymity, citing non-disclosure agreements and threats received online.
“The whole human layer has been wiped out. All that’s left on Twitter are automated detection systems,” said a former senior executive.
In a BBC interview on Tuesday, Musk claimed there was “less misinformation [on Twitter] rather than more” under his tenure. He did not comment on the state troll farms active on the platform or the team fighting them.
We approached Twitter for a comment but received no response other than a poo emoji – the company’s standard automatic response to any press request.
Organized groups of people who post coordinated messages are called “troll farms.” The term was first used by Russian journalists who exposed one such operation – an outfit of about 300 paid employees run by Yevgeny Prigozhin, leader of the Wagner mercenary group.
Since then, troll farms influencing elections and public opinion have been uncovered in many countries, from Poland and Turkey to Brazil and Mexico. They have also been used as a propaganda tool in ethnic conflicts and wars.
Now a new group of Russian trolls is active on Twitter.
The network supports Putin’s war in Ukraine, ridicules Kyiv and the West, and attacks independent Russian-language publications, including the BBC Russian Service. Many of these troll accounts have been suspended, but dozens are still active.
Darren Linvill, an associate professor at Clemson University’s Media Forensics Hub in South Carolina, says the network appears to originate from the Prigozhin troll factory.
Linvill and his colleagues also uncovered two similar Russian-language troll rings working for the opposite side: one tweets in favor of Ukraine, and another promotes the Russian opposition, including imprisoned Putin critic Alexey Navalny.
Although they have all the markings of troll accounts, including random numbers in Twitter handles and coordinated behavior, these networks seem to go undetected by the platform.
The Clemson University team also tracks pro-China accounts targeting Chinese and English language users on issues important to the Chinese government.
With only a skeletal team remaining, Twitter lacks the resources to quickly detect, attribute and eliminate this foreign propaganda, according to former employees.
The platform has also partnered with research institutions that detect information operations, but scholars say they have heard nothing from Twitter since November.
Punching above its weight
Experts have long warned of the dangers of foreign influence on social media.
In 2018, the FBI said fake accounts posing as real Americans played a central role in Russian efforts to interfere in the 2016 election. That’s when Twitter and Facebook began to hire “information operations” specialists.
“I still remember the rage I felt when I saw accounts with names like ‘Pamela Moore’ and ‘Crystal Johnson’ claiming to be real Americans from Wisconsin and New York, but with phone numbers tracing back to St. Petersburg, Russia,” recalls Yoel Roth, former head of trust and safety at Twitter.
Twitter has a fraction of the reach and budget of Facebook. But over the years it built a small but capable team. Though it could not compete with the resources of its larger rival, Twitter “still punched above its weight,” said Lee Foster, an independent expert in information operations.
Twitter hired people with backgrounds in cybersecurity, journalism, government agencies and NGOs who spoke a range of languages, including Russian, Farsi, Mandarin, Cantonese, Spanish and Portuguese.
A former investigator says: “We needed people who could understand: if Russia is likely to be the responsible actor behind this, what is their motivation for doing this particular operation?”
He says he quit because his team didn’t fit the “Twitter 2.0” that Musk was building.
“Our role was to help make using Twitter as safe as possible. And it didn’t look like that was going to continue as a priority.”
Countering propaganda around the world
The team worked closely with, but separately from, those who fought misinformation. That is because state-run campaigns can use both fake news and factual stories to promote their messages.
In 2016, Russian trolls targeted black voters in the United States using real footage showing police brutality. And in 2022, a coordinated network promoted negative – but sometimes accurate – information about the French contingent and United Nations missions in Africa’s Sahel region.
Both networks have been removed by Twitter.
As similar information operations were conducted on different platforms, Twitter employees met with peers from Meta and other companies to exchange information.
But at such meetings, Twitter investigators would recall how small their operation was. “Their team would be ten times the size of ours,” says an investigator.
Now even those limited resources are gone.
Without the team dedicated to fighting coordinated campaigns, Twitter “will slowly become more and more dangerous,” says Linvill of Clemson University.
With additional reporting by Hannah Gelbart.