TikTok content moderator sues platform for psychological trauma

The TikTok moderator was tasked with keeping harmful content off the platform, including videos of rape and murder.
TikTok is facing a critical content moderation problem. (Representational Photo)

A former content moderator at TikTok has filed a lawsuit against the platform alleging that parent company ByteDance provides inadequate safeguards to protect moderators’ mental health against a near-constant onslaught of traumatic footage.

Candie Frazier filed a proposed class-action lawsuit in the US District Court for the Central District of California, saying she spent 12 hours a day moderating videos uploaded to TikTok for a third-party contracting firm, Telus International, and witnessed “thousands of acts of extreme and graphic violence,” including mass shootings, child rape, animal mutilation, cannibalism, gang murder, and genocide.

Frazier said the volume of content uploaded to TikTok was so high that moderators had to watch three to ten videos simultaneously, with a new video loading at least every 25 seconds.

She noted that moderators were allowed only one 15-minute break in the first four hours of their shift, followed by additional 15-minute breaks every two hours afterward. The lawsuit also claims that ByteDance monitors performance closely and “heavily punishes any time taken away from watching graphic videos.”

The lawsuit alleges that TikTok and its partners have failed to meet industry-recognized standards intended to mitigate the harms of content moderation, such as offering moderators more frequent breaks, providing psychological support, and applying technical safeguards like blurring or reducing the resolution of videos under review.

Frazier says she has suffered “severe psychological trauma including depression and symptoms associated with anxiety and PTSD.” The lawsuit says Frazier has “trouble sleeping and when she does sleep, she has horrific nightmares. She often lies awake at night trying to go to sleep, replaying videos that she has seen in her mind. She has severe and debilitating panic attacks.”

The allegations in Frazier’s lawsuit echo accounts from content moderators working for other big tech companies such as Facebook, YouTube, and Google. Despite the increased public attention those reports have drawn to the job, accounts like Frazier’s suggest that moderators continue to work under severely challenging conditions.
