TikTok is rolling out a content maturity rating system to prevent young audiences from seeing videos deemed inappropriate, along with new features that will allow viewers to customize their For You pages.
Like maturity ratings used for movies and video games, the content rating system will organize videos based on “thematic maturity,” TikTok announced Wednesday. In an “approach unique to TikTok,” the platform will flag videos that contain “mature or complex themes” and assign each a “maturity score.” A video’s rating will determine whether TikTok viewers under 18 can view it while scrolling through their feeds.
TikTok’s Community Guidelines won’t change, but how it shows content to certain audiences will.
“Within these strict policies, we understand that people may want to avoid certain categories of content based on their personal preferences,” TikTok’s Head of Trust and Safety Cormac Keenan said in the announcement. “Or, for our teenage community members, some content may contain mature or complex themes that may reflect personal experiences or real-world events that are intended for older audiences.”
Fictional scenes deemed “too frightening or intense for younger audiences,” for example, won’t appear on the For You pages of users under 18.
TikTok has been criticized for exposing children to inappropriate content. Last year, the European Consumer Organization (BEUC) filed a complaint against the company, alleging that TikTok failed to protect underage users from hidden advertising and “potentially harmful” videos. In a press release, the BEUC accused TikTok of “failing to conduct due diligence” in protecting children, since suggestive videos are “just a few scrolls away.”
It’s unclear how TikTok will verify whether a user is 18 or older. TikTok users must be at least 13 to use the full version of the app, according to its terms of service. Users in the U.S. who are 12 or younger are placed into the “TikTok for Younger Users experience,” which allows for parental controls and limited interaction with other users. An EU-funded study published this year concluded that social media age checks are largely ineffective, because users across a variety of platforms can self-report any age. TikTok users only have to provide a photo ID when requesting to change their age on an existing account, not when signing up for one.
In addition to safeguards to prevent young viewers from seeing mature content, TikTok is updating its algorithm to diversify recommendations.
The app shows users videos based on content they’ve engaged with before, from renter-friendly DIY projects to divisive conversations about current events. To prevent users from getting stuck seeing videos about the same topics over and over again, TikTok is introducing a tool that allows viewers to filter out videos with certain keywords or hashtags.
Continuing to see videos about certain topics can be tiresome at best and triggering at worst. Content about sensitive topics like eating disorders, mental health and domestic abuse, for example, is nearly impossible to avoid while scrolling on TikTok. Last year, TikTok began testing ways to prevent users from seeing content that “may be fine as a single video but potentially problematic if viewed repeatedly.”
Viewers will likely see fewer videos about dieting, extreme fitness, sadness and “other well-being topics” that could be “challenging or triggering viewing experiences.” In its newsroom post, TikTok acknowledged the nuance involved in moderating sensitive content — the platform has a history of suppressing content by creators who are “vulnerable to cyberbullying” like disabled and LGBTQ creators. TikTok has also been criticized for disproportionately flagging and taking down content by Black creators.
“As we continue to build and improve these systems, we’re excited about the opportunity to contribute to long-running industry-wide challenges in terms of building for a variety of audiences and with recommended systems,” Keenan continued. “We also acknowledge that what we’re striving to achieve is complex and we may make some mistakes.”