After that, she started using the popular short-form video app to educate her followers about various digital dangers. She posted about the risks of being approached by strangers online and the problematic content found hidden in the deep corners of TikTok and other platforms.
“Minors started tagging me in posts or sharing things they’d come across that made them scared,” Adair, who now has 350,000 TikTok followers, told CNN Business. “They’d say, ‘Hey, I came across this. What do I do? Can you do something?'”
In one example, a young follower informed Adair of an alleged practice on TikTok in which minors and adults post explicit videos privately to their accounts’ “Only Me” feed, where they can then be accessed by anyone with a shared password. (A TikTok spokesperson told CNN Business it moderates each video uploaded to the site, whether set to private or public, and removes and reports any child sexual abuse material to the National Center for Missing & Exploited Children.)
Adair is part of an emerging community of so-called “watchdog moms” calling attention to potential issues for younger users on TikTok and other platforms, and building up an online following in the process. This small group of parents flags issues ranging from the sharing of exploitative videos to more routine concerns about the oversharing of photos and personal information about children online. Adair and her peers work to get problematic content taken down, which she said is often a “very long battle.”
“My main goal was just to share a different perspective so parents would reflect on their own sharing practices and maybe think, ‘I never thought about it like that. Why am I sharing my child publicly?'” she said.
These influencers never meet in person but chat often to share findings and experiences, and come together to attempt to take down concerning videos, according to Adams. The digital movement, which comes amid heightened scrutiny of the impact social media platforms have on younger users, also highlights the challenges tech companies face in effectively policing problematic content.
Last year, executives from TikTok, Snap and Meta testified before a Senate subcommittee as lawmakers questioned their apps’ impact on the mental health of teens. At the time, TikTok’s VP and head of public policy, Michael Beckerman, said the company is working to “keep its platform safe and create age-appropriate experiences” but added “we do know trust must be earned.”
“We’re seeking to earn trust through a higher level of action, transparency and accountability, as well as the humility to learn and improve,” Beckerman testified. He cited a handful of parental controls, improvements to moderation and age restrictions, such as no direct messaging for anyone under age 16, as ways TikTok protects its young user base. He also encouraged parents to get on the platform to learn more about how it works and how it makes their children feel.
A company spokesperson told CNN Business TikTok encourages members of its community to report content that may be in violation. But Adair said she believes the onus has fallen too much on herself and other advocates on TikTok to step up in areas where the site is failing.
“It should not be our responsibility as creators on the app to do this, but TikTok is not taking care of their part of it,” Adair said. “As moms, we feel the responsibility to warn other parents so they can take responsibility for their own children and prepare them for what they could face on these apps.”
Hitting ‘report’ for hours
Like other large tech platforms, TikTok relies on users and algorithmic systems to flag potentially violative content, which then gets reviewed by a human moderator to determine whether it should be removed. The company said it has thousands of safety experts around the world, including child safety professionals, moderators and policy experts, with three main hubs — Singapore, Dublin and Mountain View, California — staffed to take down videos in real time that may violate its policies.
But as with other platforms, TikTok gets criticized for what it does and does not take action on. Over the years, numerous researchers, journalists and everyday users have flagged posts and accounts to tech companies for seemingly running afoul of various platform policies. The “watchdog moms” fit into this long history, but with a focus primarily on concerning content and activity for social media’s youngest users, including on TikTok, which is widely popular with teenagers.
Carly Yoost, CEO and founder of the Child Rescue Coalition, called the emergence of parent influencers who warn about exploitation methods on social media an “important” movement to ensure both parents and children are educated on digital safety. Gabrielle Usatynski, a family therapist who focuses on social media’s impact on families, also applauded their efforts to bring attention to violative content.
But Michela Menting, a research director at global technology intelligence firm ABI Research, said it is “quite concerning” that parents have to take on this role. “It shows a total lack of regulation engaging the liability of social media providers and digital platforms to police exploitative content,” she said. “It shows a willful ignorance of the very well-known dangers of the cyber-world for minors.”
These factors enable problematic content to endure on the site, Menting added.
A TikTok spokesperson said explicit nudity is a breach of its community guidelines.
Reay, who said she experienced sexual abuse as a child for over a decade by a family member, uses the platform to share warning signs of sexual abuse and offer advice to her 1 million followers on how to help others or themselves. She also works alongside nonprofits that operate safe houses for sexual exploitation survivors, as well as organizations such as Rescue America and the National Center on Sexual Exploitation, to bring awareness to the issue.
“I will spend a whole 7-hour day just hitting ‘report’ [on a post] in between making sandwiches for my kids and doing [other things],” she told CNN Business. “I’ll call on my armies of thousands of followers to help me report, and it will take 24 to 48 hours to be taken down. Some posts were posted two years before I saw them, so I often wonder how many people were sexually exploited in that timeframe.”
Although Reay has had some success in helping take down content, she said the vast majority of videos she reports to TikTok come back with no violation found. Adams said she also reports “a very high number” of problematic accounts and posts that come back with no violations.
“The Community Guidelines on TikTok currently feel more like a ‘suggestion’ rather than hard rules that will be enforced by the platform,” she said.
Watching the watchdogs
While speaking out on these issues has helped broaden their followings on TikTok, it also appears to come with some headaches. Multiple parents said they believe TikTok has taken action against their accounts.
Reay, for example, said she believes some of her videos get less preferred placement on the platform, so they’re seen by fewer people. Other videos are taken down for violating community guidelines, she said.
She added that when she calls attention to the creators who post the inappropriate content, her own account is often temporarily suspended for bullying or flagged for minor safety. “It really inhibits the work that I do when I get put on suspensions and they’re usually about a week long,” she said.
A TikTok spokesperson said accounts or videos are removed if they themselves contain violations, such as bullying, and that every user has the right to appeal a decision they feel is incorrect.
Adair said her account has been removed from TikTok on various occasions, too. She added that part of the reason she started talking about the dangers of TikTok was to get the company’s attention. Although she said it has partially worked, she believes TikTok still has a lot of work to do to make the site a safer place for young users.
“For a company of this size, there are going to be faults, but when it comes to child exploitation and grooming, these are serious things that [shouldn’t slip through any cracks],” she said.