JAKARTA - A former TikTok moderator has sued the social media platform and its parent company ByteDance, alleging they failed to protect her mental health after she had to watch hours of traumatizing videos.

In a class-action lawsuit, former TikTok moderator Candie Frazier claims she had to view videos involving cannibalism, accidents that left bodies mutilated, school shootings, suicides, and people falling from buildings.

According to Frazier's lawsuit, TikTok's 10,000 content moderators are constantly exposed to child sexual abuse material, rape, beheadings, and animal mutilation.

"Plaintiff had trouble sleeping and while she was sleeping she had terrible nightmares," the lawsuit states.

Adding to the problem, TikTok allegedly requires moderators to work 12-hour shifts with only one hour of lunch and two breaks of just 15 minutes.

"Due to the sheer volume of content, content moderators are allowed no more than 25 seconds per video, and view three to ten videos simultaneously," Frazier said in her lawsuit.

TikTok does have guidelines meant to help moderators cope with child abuse material and other traumatic images. Among other things, the company limits moderator shifts to four hours and provides psychological support. But Frazier's lawsuit says TikTok failed to implement those guidelines.

Content moderators take on the burden of watching traumatic videos that appear on social media so that ordinary users don't have to see them. In fact, big tech companies like TikTok require moderators to acknowledge on a consent form that the job can cause post-traumatic stress disorder (PTSD).

Frazier also claims that the social media giant lacks technical safeguards, such as blurring or reducing the resolution of the disturbing videos that moderators must watch.

With the class-action lawsuit, Frazier seeks compensation from TikTok for herself and other content moderators for the psychological harm they suffered. She also wants the court to order the companies to set up a medical fund for content moderators.

A TikTok spokesperson said the company could not comment on ongoing litigation, but added that TikTok strives to promote a caring working environment for its employees and contractors.

"Our Safety Team is partnering with third-party companies in the important work of helping protect the TikTok platform and community, and we are continuing to expand the range of healthcare services so moderators feel supported mentally and emotionally," said a TikTok spokesperson.
