
QAnon videos are getting millions of views on TikTok as Trump embraces conspiracy theory

A QAnon conspiracy theory button on the purse of an attendee of the Nebraska Election Integrity Forum in Omaha on Aug. 27.Rebecca S. Gratz / AP file

QAnon conspiracy theory videos with thinly veiled hashtags are bringing in millions of views on TikTok ahead of the 2022 midterm elections.

An NBC News review found users posting videos with emojis and slight wording variations in hashtags to evade QAnon hashtag bans, drawing attention to the conspiracy theory that helped fuel the Jan. 6 Capitol riot.

TikTok removed some of the videos, all of which were sent to the company in an NBC News inquiry, but many QAnon videos and hashtags remained up on the platform.

The findings come as former President Donald Trump ramps up his public support of the conspiracy theory. In recent weeks, Trump has used QAnon’s slogan on his social media app, Truth Social, and reposted messages from Q, the anonymous account whose posts QAnon followers interpret for meaning. At a rally on Saturday, he spoke over a soundtrack identical to a song known as a QAnon theme song, and crowd members held up single fingers in response. QAnon’s slogan is “Where we go one we go all.” 

On TikTok, users have added emojis such as the American flag to QAnon hashtags like “#Trusttheplan” that were banned from the app starting in July 2020. The altered version of the hashtag has 1.9 million views, according to the app. 

The top video on the hashtag — one of dozens related to the QAnon conspiracy theory — is explicitly pro-QAnon and was posted just five days ago. “Trust the plan, Q! You still don’t believe it? Pay attention,” the video creator says, before a voice reads through a Q post while the letter Q and a GIF of Trump flash on screen. 

The user who posted the video is a QAnon influencer who is thriving on the app. He has 23,000 followers and posts numerous videos related to QAnon almost every day. One video posted on Tuesday had already been viewed more than 50,000 times by early afternoon. In the video, a Q flashes on screen while the creator puts forward a conspiracy theory about the death of software entrepreneur John McAfee.

Other QAnon hashtags on the platform include variations of #Adrenochrome, which is also banned on the platform. QAnon believers baselessly claim that global elites harvest a substance called adrenochrome from children. 

Creators who posted the adrenochrome videos used variations of the hashtag that include blood emojis and misspellings of adrenochrome to dodge TikTok’s attempts to restrict conversation around the term. One misspelled version of the hashtag has 2 million views, according to the platform. In one May video with over 14,000 likes, from a user with over 20,000 followers, text reads “adrenochrome exposed” as actors joke on talk shows about their cosmetic procedures, with one actor joking to Wendy Williams that he sucks baby blood. More text in the video reads, “They disguising it as a joke… but its clear as day.” 

The videos and hashtags, many of which have drawn significant viewership, raise questions about TikTok’s efforts to curb misinformation on the platform as the U.S. enters midterm election season.

In July 2020, TikTok attempted to address the growth of QAnon hashtags on its platform by banning a selection of them. In October 2020, the company said it was expanding the ban to all videos on the platform that advance ideas from the conspiracy theory movement. 

In the run-up to the 2022 midterm elections, TikTok said it would combat misinformation on its platform. In a blog post published Aug. 17, Eric Han, TikTok’s U.S. head of safety, wrote, “TikTok has a longstanding policy to not allow paid political advertising, and our Community Guidelines prohibit content including election misinformation, harassment — including that directed towards election workers — hateful behavior, and violent extremism.”

A TikTok spokesperson said in an email that the company took down the videos after an inquiry from NBC News.

"We’re removing this content as promotions of QAnon violate our harmful misinformation policies," the spokesperson wrote. "We continue to take steps to make this content harder to find across search and hashtags by redirecting associated terms to our Community Guidelines. We regularly update our safeguards with misspellings and new phrases as we work to keep TikTok a safe and authentic place for our community."