“Hello, my name is James Bulger,” says the image in one TikTok video, made in the likeness of the British 2-year-old who was abducted in 1993 as his mother paid for groceries.
“If my mom turned right, I could have been alive today. Unfortunately, she turned left,” the childlike voice says, citing what James’s mother once said was one of her biggest regrets: If she had turned right, she would have seen her son being led away by the two 10-year-olds who later tortured and killed him.
While TikTok has removed multiple videos depicting James narrating his abduction and death in Kirkby, England, many remain available to view on YouTube.
James’s mother, Denise Fergus, told the Daily Mirror that the posts were “disgusting.”
“To use the face and a moving mouth of a child who is no longer here, who has been brutally taken away from us, there are no words,” she told the newspaper.
A TikTok spokesperson described the content about James as “disturbing,” saying there “is no place” on the platform for such posts. “Our Community Guidelines are clear that we do not allow synthetic media that contains the likeness of a young person,” Barney Hooper said in an email to The Washington Post. “We continue to remove content of this nature as we find it.”
TikTok’s guidelines say the advancement of AI “can make it more difficult to distinguish between fact and fiction, carrying both societal and individual risks,” and ask that users who share synthetic or manipulated media showing realistic scenes include a disclaimer noting that the content is not real. YouTube did not immediately return a request for comment about its guidelines on Monday.
Felix M. Simon, a communication researcher at the Oxford Internet Institute, said he was confident that the videos mentioned in this piece were produced using “one or several AI tools,” but could not say which software exactly.
“They appear to be created with some form of AI tools and bear some of the typical hallmarks of cheaper AI-generated videos,” such as an anime or comic-like aesthetic and polished skin, he said in an email.
According to the Mirror, Stuart Fergus, the husband of James Bulger’s mother, said that after he reached out to one creator asking them to take down their video, he received a reply saying: “We do not intend to offend anyone. We only do these videos to make sure incidents will never happen again to anyone. Please continue to support and share my page to spread awareness.”
Despite TikTok’s attempts to remove such videos, many can still be found on the platform, some of which have generated millions of views.
“My name is Madeleine Beth McCann,” says a childlike voice in another video, showing an image of the British 3-year-old who disappeared from a Portuguese resort in 2007. “I’m still missing.” That TikTok video, which The Post reviewed, had been viewed tens of thousands of times before TikTok removed it. The owner of the account had written alongside the video that it was an attempt at “immersive storytelling.”
Another video shows the likeness of Anne Frank advertising baby clothes before discussing the horrors of the Holocaust. “Are you still looking for beautiful and self-designed baby clothes? Then go to the link in my bio and let yourself be surprised,” the young girl with brown shoulder-length hair “says” in German. “And now I’ll tell you the story of Anne Frank.”
In another video from the same account, also in German, a boy with blond hair and a scarred face emerges to tell the story of Peter Connelly, also known as “Baby P” in Britain. The 17-month-old died in 2007 following months of physical abuse from his mother and her boyfriend.
The video about Peter’s short life comes with a trigger warning that it may cause distress but does not include a disclaimer that the video was altered or AI generated, as is required by the platform. The owner of the account declined to comment when approached by The Post.
Another video on the platform, from a different account, tells the story of Gabriel Fernández, the 8-year-old boy from California who was fatally tortured by his mother and her boyfriend. He died in 2013. “When I started attending a new school, the teacher reported to social services that I had bruises on my body. I asked her if it was normal to bleed when my mother hit me with a belt,” the image of a boy purporting to be Gabriel says in the video, which has been viewed more than 25 million times on TikTok.
Simon, the researcher, said producing such videos is becoming easier as AI tools gain capabilities and the applications become cheaper and simpler for people to use.
While AI advances have allowed software that can imitate, or even clone, people’s voices with precision, the childlike voices in these videos appear to be computer-generated and not based on the victims’ real voices. The voice in one of the videos of Madeleine viewed by The Post, for example, has an American accent.
Simon cautioned that the videos — which are often accompanied by dramatic or sorrowful music, or show children with scars and bloodied faces — “have the potential to re-traumatize the bereaved.”
“It is a widely held belief in many societies and cultures that the deceased should be treated with dignity and have certain rights,” he said. “Such videos strike me as possible violations of this principle in two ways. Firstly, they appropriate the personality of the deceased, disregarding or defying their likely wishes. Secondly, they may infringe upon what the deceased’s relatives perceive as their dignity.”
On social media, emotional content tends to generate higher engagement, which increases the likelihood that it will be shared, research has found. In the comments section of these videos, many viewers leave tributes to the young victims, while others criticize the creators for using these victims for clout.
Cory Bradford, a TikToker who has gained almost 1 million followers producing history videos, said that while he generally avoids using AI in his own posts, those who do are likely trying to boost engagement, especially on a platform where the audience skews younger. “I suspect those using AI to make these recreations are doing so for shock and awe to generate high view counts,” he said.
Hany Farid, a professor of digital forensics at the University of California at Berkeley, also noted that manipulated media likely drives user engagement because it is “morbid and sensational.” Farid said the images of the children fall into “the general category of deepfakes, now more commonly being called generative AI.”
There is “definitely” an appetite for historical videos on TikTok and certainly a place for AI to tell these stories, Bradford said.
He said the technology to recreate realistic historical images is not “quite there” yet, adding: “I think as long as the AI is that unrealistic, there are going to be people that say this is just way too creepy.”
But “it’s getting there,” Bradford said. “And it will be there soon. And we all have to deal with the consequences of what that’s going to lead to.”