Culture
Former moderators sue TikTok for lack of mental health support
“Somebody has to suffer and see this stuff so nobody else has to.”
Two former TikTok moderators are suing the company over the lack of support they received while doing the arduous and often graphic work of reviewing questionable videos on the platform. Both Ashley Velez and Reece Young hope the suit will be granted class-action status, which would allow other TikTok moderators to join the lawsuit as well.
Velez described the severe emotional toll of moderating TikTok in an interview: “We would see death and graphic, graphic pornography. I would see nude underage children every day. I would see people get shot in the face, and another video of a kid getting beaten made me cry for two hours straight.”
The lawsuit alleges that TikTok broke California labor laws by failing to provide mental health treatment for moderators. Though Velez and Young were technically hired as contractors through other companies, it was TikTok and its parent company, ByteDance, that pushed moderators to hit certain quotas.
We can only hope the lawsuit forces TikTok to re-evaluate how it treats moderators. But TikTok isn’t the only platform whose moderators are struggling with work-related mental health issues. This is just one piece of a much larger conversation.
Pushed past the limits — It’s no secret that social media moderators spend all day looking at potentially disturbing content. This is what they sign on to do. Neither Velez nor Young feels they were accurately briefed on just how intense the work would be, though, especially as the sheer volume of content on TikTok has grown.
The problem, according to the suit, is that TikTok has not taken measures to make this work any less taxing. If anything, the company has done quite the opposite: moderators aren’t allowed to discuss what they’ve seen with anyone, thanks to compulsory non-disclosure agreements. Moderators face very high quotas, the lawsuit says, and are allowed only two 15-minute breaks in a 12-hour shift.
A reckoning — Moderation is one of social media’s most tangled conundrums. Human moderators are necessary for weeding out harmful content, but the companies behind these platforms often take shortcuts to keep the moderation machine turning. Understaffed and under-supported, moderation teams end up putting far too much on the shoulders of each individual worker.
TikTok is not the first platform to face this problem. YouTube moderators have brought similar issues to light at Google, with more than a handful diagnosed with PTSD. Facebook, caught up in similar allegations, agreed to pay a total of $52 million to current and former moderators.
Advances in artificial intelligence can relieve some of the moderating burden, but algorithms are nowhere near smart enough to take humans out of the equation yet. If lawsuits are what it takes to force social media companies to truly confront their failures to protect workers, then we say keep them coming.