A content moderator, Frazier, is suing TikTok and its parent company, alleging that she suffers from “psychological trauma” because they failed to implement safety measures that are standard in the industry.
Her attorney, Steve Williams, said Frazier views “horrific stuff nonstop.” The lawsuit, which seeks class-action status, was filed Thursday in U.S. District Court for the Central District of California.
The suit says she is exposed to posts that include “child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.” She is also subjected to conspiracy theories, distortions of historical facts and political disinformation, the suit says.
The suit says Frazier suffers from post-traumatic stress disorder “as a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace.”
It alleges that TikTok did not warn Frazier that viewing such posts “can have a significant negative mental health impact on content moderators.”
A Telus International job description for a content moderator, which is available online, does say the posts “may include graphic, violent, explicit, political, profane and otherwise disturbing content.”
The job description lists “sound coping, emotion regulation, and stress-management skills” as a requirement. It is not clear whether the job description was available when Frazier applied for the job.
A spokesperson for Telus International, which is not listed as a defendant in the lawsuit, said the company has “a robust resiliency and mental health program in place to support all our team members, as well as a comprehensive benefits program for access to personal health and well-being services.”
The spokesperson said “team members can elevate questions and concerns about any aspect of their job through several internal channels.”
Frazier “has never previously raised these concerns about her work environment and her allegations are entirely inconsistent with our policies and practices,” the spokesperson added in a statement.
The suit alleges that content moderators working for TikTok and its parent company, ByteDance, are at greater risk of suffering from PTSD because the companies have failed to implement “workplace safety measures.”
“The claim is to say: ‘I want to do my job. I just want to do my job with proper protections,’” Williams said. “Just like any other hazardous work.”
While Williams acknowledged that the safety measures are not required by law, the lawsuit says there are “industry-recognized standards.”
Protocols that other companies and nonprofit groups follow include limiting content moderators’ shifts to four hours. Frazier works 12-hour shifts, with two 15-minute breaks and an hour for lunch, the suit says.
The Technology Coalition, of which ByteDance is a member, also recommends that content moderators be provided with counseling and be allowed to opt out of viewing child sexual abuse imagery.
The coalition — which also includes Facebook, YouTube, Snap Inc. and Google — says companies “must support those employees who are the front line of this battle,” according to the suit.
The National Center for Missing and Exploited Children also encourages companies to mitigate the effects of disturbing images on employees by displaying the images in black and white, blurring parts of videos, showing videos at lower resolutions and muting them, the suit says.
“ByteDance and TikTok have further failed to implement the standards suggested by the Technology Coalition, despite being a member,” it says.
A spokesperson said that TikTok does not comment on pending litigation but that “we strive to promote a caring working environment for our employees and contractors.”
“Our Safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” the spokesperson wrote in a statement.
The spokesperson shared a page outlining TikTok’s efforts to ensure user safety.