‘It’s a dirty little secret’: The mental health crisis among the internet’s content moderators

As social media platforms grow ever larger, so too has the army of people employed to filter out the most gruesome and extreme content posted to them. The human cost is often great, writes Greg Noone

Wednesday 04 September 2019 18:18 BST
Content moderation has been around almost as long as the internet itself (Washington Post/Getty)

Horror films were no longer convincing for Max. They seemed timid in comparison to the heinous material he watched daily as a content moderator. Max – not his real name – found the workload heavy, but manageable. By the time he would eventually leave his job reviewing video as a contractor for a major social media platform, so many flagged clips were arriving in his inbox that he could devote no more than 30 seconds to reviewing each one.

Most of them were benign, mistakenly flagged and easy to filter out. Once in a while, though, the worst of it – an animal abused, a head disconnected – would play peekaboo in the lineup. Over time, Max grew jaded, his skin a little thicker. Occasionally, the worst images would continue to flash in his memory, at great cost to his personal life.

“This one time my girlfriend and I were fooling around on the couch,” Max recalls. His girlfriend made an innocuous joke, and he shut down the conversation. “I doubt, maybe a decade down the line, maybe I will stop encountering these things that bring it up, but who knows?”
