‘It’s a dirty little secret’: The mental health crisis among the internet’s content moderators
As social media platforms grow ever larger, so too has the army of people employed to filter out the most gruesome and extreme content posted to them. The human cost is often great, writes Greg Noone
Horror films were no longer convincing for Max. They seemed timid in comparison to the heinous material he watched daily as a content moderator. Max – not his real name – found the workload heavy, but manageable. By the time he eventually left his job reviewing video as a contractor for a major social media platform, so many flagged clips were being deposited in his inbox that the review process for each could stretch to only 30 seconds.
Most of them were benign, mistakenly flagged and easy to filter out. Once in a while, though, the worst of it – an animal abused, a head disconnected – would play peekaboo in the lineup. Over time, Max grew jaded, his skin a little thicker. Occasionally, the worst images would continue to flash in his memory, at great cost to his personal life.
“This one time my girlfriend and I were fooling around on the couch,” Max recalls. His girlfriend made an innocuous joke, and he shut down the conversation. “I doubt, maybe a decade down the line, maybe I will stop encountering these things that bring it up, but who knows?”