His friends replied that it wasn’t just him. They too were receiving violent videos in their feed. Twitter users also began posting about the phenomenon. “Hey @instagram,” one Twitter user posted in September, “why was the first thing on my feed today a beheading video from an account i don’t even follow? Thx!” Mitchell, an Instagram user in his early 20s who asked to be referred to solely by his first name because of security concerns, said, “It started with a video of a car crash, or an animal getting hit by a train. I just scrolled past it. Then I started to see people get shot.”
Since Instagram launched Reels, the platform’s TikTok competitor, in 2020, it has taken aggressive steps to grow the feature. It rewarded accounts that posted Reels videos with increased views and began paying monthly bonuses to creators whose Reels content performed well on the app.
Instagram also announced last year it would be leaning harder into algorithmic recommendation of content. On Meta’s second-quarter earnings call, CEO Mark Zuckerberg noted that Reels videos accounted for 20 percent of the time people spent on Instagram, saying that Reels engagement was “growing quickly” and that the company saw a 30 percent increase in the amount of time people spent engaging with Reels.
But at least part of that engagement has come from the kinds of videos Reinman and other users have raised concerns about, a result that shows how Meta’s Instagram has failed to contain harmful content on its platform as it seeks to win back the audience it has lost to TikTok.
A Meta spokesperson said that the company was conducting a review of the content in question, adding that the platform removes millions of offensive videos and takes other steps to try to limit who can see them. “This content is not eligible to be recommended and we remove content that breaks our rules,” the spokesperson said in a statement. “This is an adversarial space so we’re always proactively monitoring and improving how we prevent bad actors from using new tactics to avoid detection and evade our enforcement.”
Meme pages are some of Instagram’s most popular destinations, posting videos, photos and memes designed to make viewers laugh or feel a connection. Collectively, these pages reach tens of millions of followers, and their audiences often skew very young: according to a survey from marketing firm YPulse, 43 percent of 13- to 17-year-olds follow a meme account, an age group whose online safety is one of the few things Democrats and Republicans in Congress agree on. Adding to the concern, the majority of people running the accounts are young themselves, often teenagers, those in the meme community say.
While the majority of meme pages don’t traffic in shock content, a sprawling underbelly of accounts competing for views has begun posting increasingly violent material.
The videos are truly horrific. In one video, a bloody pig is fed into a meat grinder. It amassed over 223,000 views. Other Reels videos that amassed tens of thousands of views show a woman about to be beheaded with a knife, a man being strung up in a basement and tortured, a woman being sexually assaulted. Several videos show men getting run over by cars and trains, and dozens show people getting shot. Other Reels videos contain footage of animals being shot, beaten and dismembered.
“#WATCH: 16-year-old girl beaten and burned to death by vigilante mob,” reads the caption on one video, which shows a bloody young woman being beaten and burned alive. The video was shared to an Instagram meme page with over 567,000 followers.
One day last week, four large meme pages, two with over 1 million followers, posted a video of a young child being shot in the head. The video amassed over 83,000 views in under three hours on just one of those pages (the analytics for the other three pages weren’t available). “Opened Insta up and boom first post wtf,” one user commented.
Large meme accounts post the graphic content to Reels in an effort to boost engagement, meme administrators and marketers said. They then monetize that engagement by selling sponsored posts, primarily to agencies that promote OnlyFans models. The higher a meme page’s engagement rate, the more it can charge for such posts. These efforts have escalated in recent months as marketers pour more money into meme pages in an effort to reach a young, highly engaged audience of teenagers, marketers said.
Sarah Roberts, an assistant professor at the University of California, Los Angeles, who specializes in social media and content moderation, said that while what the meme accounts are doing is unethical, Instagram ultimately created this environment and must shoulder the blame for facilitating a toxic ecosystem.
“The buck has to stop with Instagram and Meta,” she said, referring to Instagram’s parent company. “Of course, the meme accounts are culpable, but what’s fundamentally culpable is an ecosystem that provides such fertile ground for these metrics to have such intrinsic economic value. … [W]ithout Instagram providing the framework, it wouldn’t enter into someone’s mind, ‘let’s put a rape video up because it boosts engagement.’ They’re willing to do anything to boost those numbers, and that should disturb everyone.”
Some meme pages create original content, but many primarily republish media from around the web. Meme pages like @thefatjewish and an account whose name is too profane to print were some of the most powerful early influencers on Instagram, building huge marketing businesses around their millions of followers.
In recent years, some successful meme pages have expanded to become media empires. IMGN Media, which operates several popular Instagram meme pages including @Daquan, which has over 16.3 million followers, raised $6 million in funding in 2018 to grow its business before being acquired by Warner Music Group in 2020 for just under $100 million. Doing Things Media, which owns a slate of viral meme pages, raised $21.5 million in venture capital funding earlier this year. None of these companies or the accounts they manage have posted violent videos of the nature discussed here.
Many meme account administrators are young, part of a growing cohort of children seeking to leverage the internet early for financial and social gain. George Locke, 20, a college student who began running meme accounts at age 13, the youngest age at which Instagram permits a user to have an account, said he has never posted gore but has seen many other young people turn to those methods.
“I’d say over 70 percent of meme accounts are [run by kids] under the age of 18,” he said. “Usually when you start a meme account, you’re in middle school, maybe a freshman in high school. That’s the main demographic for meme pages, those younger teens. It’s super easy to get into, especially with the culture right now where it’s the grind and clout culture. There’s YouTube tutorials on it.”
Meta says it puts warning screens and age restrictions on disturbing content. “I don’t think there’s a world where all [meme pages and their followers] are 18-year-olds,” Locke said.
Jackson Weimer, 24, a meme creator in New York, said he began to notice more graphic content on meme pages last year, when Instagram began to push Reels content heavily in his Instagram feed. At first, meme pages were posting sexually explicit videos, he said. Then the videos became darker.
“Originally, these pages would use sexual content to grow,” he said, “but they soon transitioned to use gore content to grow their accounts even quicker. These gore Reels have very high engagement, there’s a lot of people commenting.”
Commenting on an Instagram video generates engagement. “People die on my page,” one user commented below a meme page’s video of a man and a woman simulating sex, hoping to draw viewers to his own account. Other comments below graphic videos promoted child pornography groups on the messaging app Telegram.
In 2021, Weimer and 40 other meme creators reached out to Instagram to complain about sexually explicit videos shared by meme pages, warning the company that pages were posting increasingly violative content. “I am a little worried that some of your co-workers at Instagram aren’t fully grasping how huge and widespread of an issue this is,” Weimer wrote to a company representative in an email he shared with The Post.
Instagram declined to meet with the creators about their concerns. The content shared by many large pages has only become more graphic and violent. “If I opened Instagram right now and scrolled for five seconds, there’s a 50 percent chance I’ll see a gore post from a meme account,” Weimer said. “It’s beheadings, children getting run over by cars. Videos of the most terrible things on the internet are being used by Instagram accounts to grow an audience and monetize that audience.”
A Meta spokesperson said that, since 2021, the company has rolled out a suite of controls and safety features for sensitive content, including demoting posts that contain nudity and sexual themes.
The rise in gore on Instagram appears to be organized. In Telegram chats viewed by The Post, the administrators for large meme accounts traded explicit material and coordinated with advertisers seeking to run ads on the pages posting graphic content. “Buying ads from nature/gore pages only,” read a post from one advertiser. “Buying gore & model ads!!” said another post by a user with the name BUYING ADS (#1 buyer), adding a moneybag emoji.
In one Telegram group with 7,300 members, viewed by The Post, the administrators of Instagram meme pages with millions of followers shared violent videos with one another. “Five Sinola [Sinaloa] cartel sicarios [hired killers] are beheaded on camera,” one user posted, attaching the beheading video and adding “ … Follow the IG” with a link to his Instagram page.
Sam Betesh, an influencer marketing consultant, said that the primary way these sorts of meme accounts monetize is by selling sponsored posts to OnlyFans marketing agencies, which act as middlemen between meme pages and OnlyFans models, who generate revenue by posting pornographic content behind a paywall to subscribers. An OnlyFans representative declined to comment but noted that these agencies are not directly affiliated with OnlyFans.
Meme accounts are fertile ground for this type of advertising because of their often young male audience. OnlyFans models’ advertising options are limited on the broader web because of the sexual nature of their services. The higher the meme page’s engagement rate is, the more the page can charge the OnlyFans agencies for ads.
“The only place you can put one dollar in and get three dollars out is Instagram meme accounts,” Betesh said. “These agencies are buying so many meme account promos they’re not doing due diligence on all the accounts.”
OnlyFans models whose images were promoted in advertisements on meme pages said they were unaware that ads with their image were being promoted alongside violent content. Nick Almonte, who runs an OnlyFans management company, said that he does not purchase ads from any accounts that post gore, but he has seen gore videos pop up in his Instagram feed.
“We’ve had [OnlyFans] girls come to us and say ‘Hey, these guys are doing these absurd things to advertise me, I don’t want to be involved with the type of people they’re associated with,’” Almonte said. “This happens on a weekly basis.”
Meme accounts are potentially raking in millions by posting such violence, said Liz Hagelthorn, a meme creator who formerly ran the largest meme network on Instagram, consisting of 127 pages with a collective 300 million followers. Hagelthorn said none of her pages ever posted violence. But young, often teenage, meme account administrators see gore as a way to cash in, she said.
“With gore, the more extreme the content is, is what the algorithm is optimizing for,” she said. “Overall what you see is when people hate the content or disagree with the content they’re spending 8 to 10 percent longer on the post and it’s performing 8 to 10 percent better.”
Some pages posting graphic violence are making over $2 million a year, she estimated. “The meme industry is an extension of the advertising and influencer industry,” she said, “and it is a very lucrative industry. If you have a million followers, you make at a base $3,000 to $5,000 per post. Bigger meme pages can make millions a year.”
“This is organized,” said Weimer. “It’s not two people posting gore videos, it’s hundreds of people in group chats coordinating posting and account growth.”
The administrators for several accounts posting gore appear to be young men, which Hagelthorn said is expected because most meme administrators are in their teens or early 20s. “These meme page audiences are 13- to 17-year-olds, so the people who run the page are young,” Hagelthorn said.
Roberts, the assistant professor at UCLA, said that she worries about the effect this content and ecosystem are having on young people’s notions of morality.
“It seems like we’re raising a generation of adolescent grifters who will grow up having a totally skewed relationship of how to be ethical and make a living at the same time,” she said. “This is not normal and it’s not okay for young people to be exposed to it, much less be profiting from it.”