By Paul Myers, Olga Robinson, Shayan Sardarizadeh and Mike Wendling, BBC Verify and BBC News
A network of Russia-based websites masquerading as local American newspapers is pumping out fake stories as part of an AI-powered operation that is increasingly targeting the US election.
A former Florida police officer who relocated to Moscow is one of the key figures behind it, a BBC investigation can reveal.
It would have been a bombshell report – if it were true.
Olena Zelenska, the first lady of Ukraine, allegedly bought a rare Bugatti Tourbillon sports car for 4.5m euros ($4.8m; £3.8m) while visiting Paris for D-Day commemorations in June. The source of the funds was supposedly American military aid money.
The story appeared on an obscure French website just days ago – and was swiftly debunked.
Experts pointed out strange anomalies on the invoice posted online. A whistleblower cited in the story appeared only in an oddly edited video that may have been artificially created. Bugatti issued a sharp denial, calling it “fake news”, and its Paris dealership threatened legal action against the people behind the false story.
But before the truth could even get its shoes on, the lie had gone viral. Influencers had already picked up the false story and spread it widely.
One X user, the pro-Russia, pro-Donald Trump activist Jackson Hinkle, posted a link seen by more than 6.5m people. Several other accounts spread the story to millions more X users – at least 12m in total, according to the site’s metrics.
It was a fake story, on a fake news website, designed to spread widely online, with its origins in a Russia-based disinformation operation BBC Verify first revealed last year – at which point the operation appeared to be trying to undermine Ukraine’s government.
Our latest investigation, carried out over more than six months and involving the examination of hundreds of articles across dozens of websites, found that the operation has a new target – American voters.
Dozens of bogus stories tracked by the BBC appear aimed at influencing US voters and sowing distrust ahead of November’s election. Some have been roundly ignored but others have been shared by influencers and members of the US Congress.
The story of the Bugatti hit many of the top themes of the operation – Ukrainian corruption, US aid spending, and the inner workings of French high society.
Another fake which went viral earlier this year was more directly aimed at American politics.
It was published on a website called The Houston Post – one of dozens of sites with American-sounding names which are in reality run from Moscow – and alleged that the FBI illegally wiretapped Donald Trump’s Florida resort.
It played neatly into Trump’s allegations that the legal system is unfairly stacked against him, that there is a conspiracy to thwart his campaign, and that his opponents are using dirty tricks to undermine him. Mr Trump himself has accused the FBI of snooping on his conversations.
Experts say that the operation is just one part of a much larger ongoing effort, led from Moscow, to spread disinformation during the US election campaign.
While no hard evidence has emerged that these particular fake news websites are run by the Russian state, researchers say the scale and sophistication of the operation are broadly similar to previous Kremlin-backed efforts to spread disinformation in the West.
“Russia will be involved in the US 2024 election, as will others,” said Chris Krebs, who as the director of the US Cybersecurity and Infrastructure Security Agency was responsible for ensuring the integrity of the 2020 presidential election.
“We’re already seeing them – from a broader information operations perspective on social media and elsewhere – enter the fray, pushing against already contentious points in US politics,” he said.
The BBC contacted the Russian Foreign Ministry and Russia’s US and UK embassies, but received no response. We also attempted to contact Mr Hinkle for comment.
How the fakes spread
Since state-backed disinformation campaigns and money-making “fake news” operations attracted attention during the 2016 US election campaign, disinformation merchants have had to get more creative both in spreading their content and making it seem credible.
The operation investigated by BBC Verify uses artificial intelligence to generate thousands of news articles, posted to dozens of sites with names meant to sound quintessentially American – Houston Post, Chicago Crier, Boston Times, DC Weekly and others. Some use the names of real newspapers that went out of business years or decades ago.
Most of the stories on these sites are not outright fakes. Instead, they are based on real news stories from other sites apparently rewritten by artificial intelligence software.
In some instances, instructions to the AI engines were visible on the finished stories, such as: “Please rewrite this article taking a conservative stance”.
The stories are attributed to hundreds of fake journalists with made-up names and, in some cases, profile pictures taken from elsewhere on the internet.
For instance, a photo of best-selling writer Judy Batalion was used on multiple stories on a website called DC Weekly, “written” by an online persona called “Jessica Devlin”.
“I was totally confused,” Ms Batalion told the BBC. “I still don’t really understand what my photo was doing on this website.”
Ms Batalion said she assumed the photo had been copied and pasted from her LinkedIn profile.
“I had no contact with this website,” she said. “It’s made me more self-conscious about the fact that any photo of yourself online can be used by someone else.”
The sheer number of stories – thousands each week – along with their repetition across different websites, indicates that the process of posting AI-generated content is automated. Casual browsers could easily come away with the impression that the sites are thriving sources of legitimate news about politics and hot-button social issues.
However, interspersed within this tsunami of content is the real meat of the operation – fake stories aimed increasingly at American audiences.
The stories often blend American and Ukrainian political issues – for instance, one claimed that a worker for a Ukrainian propaganda outfit was dismayed to find that she was assigned tasks designed to knock down Donald Trump and bolster President Biden.
Another report invented a New York shopping trip made by Ukraine’s first lady, and alleged she was racist towards staff at a jewellery store.
The BBC has found that forged documents and fake YouTube videos were used to bolster both false stories.
Some of the fakes break out and get high rates of engagement on social media, said Clement Briens, senior threat intelligence analyst at cybersecurity company Recorded Future.
His company says that 120 websites were registered by the operation – which it calls CopyCop – over just three days in May. And the network is just one of a number of Russia-based disinformation operations.
Other experts – at Microsoft, Clemson University, and Newsguard, a company that tracks misinformation sites – have also been tracking the network. Newsguard says it has counted at least 170 sites connected to the operation.
“Initially, the operation seemed small,” said McKenzie Sadeghi, Newsguard’s AI and foreign influence editor. “As each week passed it seemed to be growing significantly in terms of size and reach. People in Russia would regularly cite and boost these narratives, via Russian state TV, Kremlin officials and Kremlin influencers.
“There’s about a new narrative originating from this network almost every week or two,” she said.
Making the fake appear real
To further bolster the credibility of the fake stories, operatives create YouTube videos, often featuring people who claim to be “whistleblowers” or “independent journalists”.
In some cases the videos are narrated by actors – in others, the narration appears to be AI-generated.
Several of the videos appear to be shot against a similar-looking background, further suggesting a co-ordinated effort to spread fake news stories.
The videos aren’t themselves meant to go viral, and have very few views on YouTube. Instead, the videos are quoted as “sources” and cited in text stories on the fake newspaper websites.
For instance, the story about the Ukrainian information operation allegedly targeting the Trump campaign cited a YouTube video which purported to include shots from an office in Kyiv, where fake campaign posters were visible on the walls.
Links to the stories are then posted on Telegram channels and other social media accounts.
Eventually, the sensational “scoops” – which, like the Trump wiretap story and a slew of earlier stories about Ukrainian corruption, often repeat themes already popular among patriotic Russians and some supporters of Donald Trump – can reach both Russian influencers and audiences in the West.
Although only a few rise to the highest levels of prominence, some have spread to millions – and to powerful people.
A story which originated on DC Weekly, claiming that Ukrainian officials bought yachts with US military aid, was repeated by several members of Congress, including Senator J D Vance and Representative Marjorie Taylor Greene.
Mr Vance is one of a handful of politicians mentioned as a potential vice-presidential running mate for Donald Trump.
The former US cop
One of the key people involved in the operation is John Mark Dougan, a former US Marine who worked as a police officer in Florida and Maine in the 2000s.
Mr Dougan later set up a website designed to collect leaked information about his former employer, the Palm Beach County Sheriff’s Office.
In a harbinger of his activities in Russia, Mr Dougan’s site published authentic information including the home addresses of police officers, alongside fake stories and rumours. The FBI raided his apartment in 2016, at which point he fled to Moscow.
He has since written books, reported from occupied parts of Ukraine and made appearances on Russian think tank panels, at military events and on a TV station owned by Russia’s ministry of defence.
In text message conversations with the BBC, Mr Dougan has flatly denied being involved with the websites. On Tuesday, he denied any knowledge of the story about the Bugatti sports car.
But at other times he has bragged about his prowess in spreading fake news.
At one point he also implied that his activities are a form of revenge against American authorities.
“For me it’s a game,” he said. “And a little payback.”
At another point he said his YouTube channel had “received many strikes for misinformation” over his reporting from Ukraine, raising the prospect of the channel being taken offline.
“So if they want to say misinformation, well, let’s do it right,” he texted.
A large body of digital evidence also shows connections between the former police officer and the Russia-based websites.
The BBC and experts we consulted traced IP addresses and other digital information back to websites run by Dougan.
At one point a story on the DC Weekly site, written in response to a New York Times piece which mentioned Dougan, was attributed to “An American Citizen, the owner of these sites,” and stated: “I am the owner, an American citizen, a US military veteran, born and raised in the United States.”
The article signed off with Dougan’s email address.
Shortly after we reported on Mr Dougan’s activities in a previous story, a fake version of the BBC website briefly appeared online. It was linked through digital markers to his network.
Mr Dougan is most likely not the only person working on the influence operation, and it remains unclear who funds it.
“I think it’s important not to overplay his role in this campaign,” said Darren Linvill, co-director of Clemson University’s Media Forensic Hub, which has been tracking the network. “He may be just a bit of a bit player and a useful dupe, because he’s an American.”
Despite his appearances on state-run media and at government-linked think tanks, Mr Dougan denies he is being paid by the Kremlin.
“I have never been paid a single dime by the Russian government,” he said via text message.
Targeting the US election
The operation that Dougan is involved in has increasingly shifted its focus from stories about the war in Ukraine to stories about American and British politics.
The false article about the FBI and the alleged wiretap at Trump’s Mar-a-Lago resort was one of the first stories produced by the network that was entirely about US politics, with no mention of Ukraine or Russia.
Clint Watts, who leads Microsoft’s Digital Threat Analysis Center, said that the operation often blends together issues with salience both in Ukraine and the West.
Mr Watts said that the volume of content being posted and the increasing sophistication of Russia-based efforts could potentially pose a significant problem in the run-up to November’s election.
“They’re not getting mass distribution every single time,” he said, but noted that several attempts made each week could lead to false narratives taking hold in the “information ocean” of a major election campaign.
“It can have an outsized impact”, and stories from the network can take off very quickly, he said.
“Gone are the days of Russia purchasing ads in roubles, or having pretty obvious trolls that are sitting in a factory in St. Petersburg,” said Nina Jankowicz, head of the American Sunlight Project, a non-profit organisation attempting to combat the spread of disinformation.
Ms Jankowicz was briefly director of the short-lived US Disinformation Governance Board, a branch of the Department of Homeland Security designed to tackle false information.
“Now we’re seeing a lot more information laundering,” she said – using a term referring to the recycling of fake or misleading stories into the mainstream in order to obscure their ultimate source.
Where it goes next
Microsoft researchers also say the operation is attempting to spread stories about UK politics – with an eye on Thursday’s general election – and the Paris Olympics.
One fake story – which appeared on the website called the London Crier – claimed that Mr Zelensky bought a mansion owned by King Charles III at a bargain price.
It was seen by hundreds of thousands of users on X, and shared by an official Russian embassy account. YouTube removed an AI-narrated video posted by an obscure channel that was used as the source of the false story after it was flagged by BBC Verify.
And Mr Dougan hinted at even bigger plans when asked whether increased attention on his activities would slow the spread of his false stories.
“Don’t worry,” he said, “the game is being upped.”