TikTok content moderator sues company after exposure to graphic videos caused PTSD

Parent company ByteDance Inc. is facing a class-action lawsuit from TikTok content moderators. The suit was filed Thursday in federal court in Los Angeles against the video-sharing platform, and other content moderators are expected to join the class action.

Candie Frazier’s complaint alleges that TikTok provides no support to its content moderators, even though the company is well aware of the risks the work poses to their mental health.

Frazier, who says she suffers from PTSD, is asking for compensation for psychological harm and requesting a court order requiring the company to set up a medical fund for moderators.


A similar scenario played out in 2020 with both YouTube and Facebook; each ended up paying tens of millions of dollars to former workers who developed PTSD while moderating content for the platforms.

Frazier says her job was to screen videos and remove those that violated the platform’s rules. The videos she saw included unspeakable horrors.

Frazier claims the videos she was exposed to included “freakish cannibalism, crushed heads, school shootings, suicides, and even a fatal fall from a building, complete with audio.”

There are currently 10,000 content moderators working for TikTok. According to the lawsuit, these moderators are regularly exposed to child pornography and violence, and rapes, beheadings, animal mutilation, and self-harm were commonly seen.

They must watch hundreds of offensive videos during every 12-hour shift, with only an hour off for lunch and two 15-minute breaks. They are also permitted only 25 seconds per video and often must view three to ten videos at once.

“Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time,” her lawyers at the Joseph Saveri Law Firm, LLP, said in the complaint.

Guidelines ignored for TikTok content moderators 

TikTok was part of a group of social media companies, including Facebook and YouTube, that developed guidelines intended to protect moderators from the effects of exposure to violent images.

TikTok is accused of ignoring the policies it helped create and of telling new hires they would be protected by guidelines that are not actually in use.

The guidelines call for psychological support and shifts limited to four hours to help moderators cope. They were developed because exposure to violent and disturbing videos has been shown to cause post-traumatic stress disorder (PTSD).

According to the complaint, “ByteDance and TikTok are aware of the negative psychological effects that viewing graphic and objectionable content has on Content Moderators. Despite this knowledge, they have not implemented safety standards known throughout the industry to protect their Content Moderators from harm.”

“Plaintiff has trouble sleeping and when she does sleep, she has horrific nightmares,” according to the complaint. 

TikTok issued a statement saying it cannot comment on pending litigation, but that it has always tried “to promote a caring working environment for our employees and contractors.”

“Our safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” a company spokesperson said.