A content moderator at TikTok recently filed a lawsuit against the company over the trauma caused by disturbing graphic videos. And it’s not just the nature of the content – according to the lawsuit, moderators are also exposed to an enormous volume of it, which seriously affects their mental health.
Moderator Candie Frazier has filed a proposed class-action lawsuit against TikTok and its parent company, ByteDance Inc. She claims that the videos moderators have to screen include gruesome and disturbing content such as child pornography, rapes, beheadings, and animal mutilation. According to Frazier, she also had to moderate scenes of “freakish cannibalism, crushed heads, school shootings, suicides, and even a fatal fall from a building, complete with audio,” Bloomberg reports.
But believe it or not, it gets worse. The lawsuit claims that TikTok’s 10,000 content moderators have to screen an insane amount of content. They work 12-hour shifts with only one hour of break time in total, and during those shifts they watch hundreds of highly disturbing videos. “Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time,” Frazier’s lawyers said in the complaint.
TikTok didn’t comment on the ongoing lawsuit. A company spokesperson only issued a statement saying that the company strives “to promote a caring working environment for our employees and contractors,” as Bloomberg reports.
“Our safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”
Social media platforms like Instagram and Facebook have relied on AI to moderate inappropriate content since 2020, and TikTok announced a similar move earlier this year. However, as we all know, these systems are far from infallible: AI moderation has censored ancient statues, Baroque paintings, and even a 30,000-year-old statue. This is why a lot of content still goes through human moderation, even on platforms that use AI.
According to the complaint, TikTok, along with Facebook and YouTube, helped develop guidelines meant to help moderators cope with the images of child abuse they view daily on the job. These include providing psychological support to moderators and limiting their shifts to four hours. However, TikTok reportedly failed to implement them.
In the lawsuit, Frazier claims that all of this has caused her to develop PTSD. She is asking for “compensation for psychological injuries” and “a court order requiring the company to set up a medical fund for moderators.”
[via Bloomberg]