I would know — I’m Dave, sometimes known as The Washington Post’s “TikTok guy.” I started the account in May 2019, and it quickly morphed into a full-time job during the pandemic as our account grew to 1.5 million followers and more people turned to the platform for news, entertainment and often both. According to a recent study by the Pew Research Center, more than a quarter of U.S. adults under 30 regularly get their news from TikTok. Over time, our team expanded and we hired Chris and Carmella. Even as our overall viewership has grown, everyone on the team, at one point or another, has created a TikTok that the algorithm appears to have just completely squashed.
Technically, you never know for sure if you are being suppressed. TikTok’s community guidelines state, “Our algorithms are designed with trust and safety in mind. For some content, we may reduce discoverability, including by redirecting search results, or making videos ineligible for recommendation in the ‘For You’ feed.” But when Chris joined the team in 2021, he began reporting on suppression, and his series got little to no traction.
When you’re suppressed, most or all of the views come from the “Following” or “Personal Profile” page, not the “For You” page. If you use sensitive hashtags that TikTok doesn’t like, you can’t even click through to them. It takes you to a page with zero results. Your traffic is suddenly a fraction of what it used to be. Creators with hundreds of thousands of views suddenly have as few as 400.
Were Chris’s TikToks officially suppressed? There’s no clear way of knowing. Again, TikTok rarely offers any explanations.
Recently, to supplement The Post’s coverage of TikTok, we decided to actively try to get suppressed. Basically, we wanted to get TikTok to punish The Post’s account, keeping viewers from seeing our content, by posting the kinds of videos we’ve found the app suppresses.
The plan was to make two TikToks. The first would be the “control.” (If it’s starting to sound like a fourth-grade science project, with a hypothesis, control and variable, then you’re catching on.) The experiment was overly simplistic, but in my defense, I never really grasped science much beyond that grade.
In our “control” TikTok, we explained the experiment but stopped short of saying or doing anything that might get us suppressed — Chris even holds up a sign that says “$uppressed.” (It’s common to intentionally misspell or change words on TikTok to avoid suppression. Creators will say “unalive” instead of “dead” or “seggs” instead of “sex.”) We also told people they may not see our second TikTok and to look for it in our profile. Remember: One possible sign of suppression is that the TikTok doesn’t show up on anyone’s “For You” page.
The second TikTok was shot the same way, but I put on a jacket. The consensus among creators is that the algorithm penalizes uploads of completely identical videos. For instance, every news TikTok account (including ours) uploaded an identical clip of Biden getting inaugurated, and it’s possible this worked against everyone. The second video had a very similar script, but we didn’t shy away from saying “suppressed.” It also ended with us saying as many potentially suppressed words as possible. In the past, TikTok has admitted to suppressing some of the words or phrases we used.
Our experiment did not work. The supposedly “suppressed” TikTok gained views faster than any TikTok we’ve ever posted, and we’ve made nearly 2,000. At nearly 25,000 comments, it’s also our most-commented TikTok ever. As an attempt to get suppressed, it failed miserably.
Many commenters pointed out that the experiment was flawed. Our intention in spacing the two videos apart was to make sure enough people saw the first one. We needed those followers to report back to us on where they found the second video, or whether they were seeing it at all. But in the four hours between the two videos, TikTok may have flagged the follow-up to moderators. If TikToks really are moderated by humans, it would have been very easy for one to watch ours and decide not to suppress it. The “suppressed” words we used also had no bite or meaning.
At 3:30 p.m., when the second TikTok was posted, more than 100,000 people had viewed the first TikTok. Within 10 minutes, there were hundreds of comments from those loyal soldiers, saying they did indeed find the video on our profile. Shortly after, nearly every commenter said, “FYP,” meaning “For You” Page. Some viewers noted that as a verified news account, it’s possible we have more leeway. Though again, we’ve experienced very low traffic following certain videos — specifically TikToks about TikTok.
Not unlike every science class I took in high school, the experiment we concocted underperformed. But from a journalistic perspective, it’s fair to say this brought attention to the issue both inside and outside the app. Given the 25,000 people who dutifully commented on the second TikTok, the app’s users are clearly invested in this very important issue.
As TikTok grows as a news source (and, for some Americans, the only news source), the lack of transparency around its suppression techniques should be scrutinized as much as possible. If we try to get suppressed again in the future, we will have to be far sneakier. So, you’ll just have to follow The Post’s TikTok account to see what’s next.
Editing by Karly Domb Sadof. Additional editing by Monique Woo and Dave Jorgenson.