
Child abuse, violence, assault: Doesn’t Facebook care about its content moderators?

It’s not just about violence and hate speech: democracy is at risk too

Damian Collins, Sean Casten, Phumzile van Damme
Wednesday 13 April 2022 12:35 BST
This is a golden opportunity to set minimum standards for how social media companies moderate harmful content, in a way that respects and upholds their workers’ rights (PA)

Recent revelations have laid bare Facebook’s continued neglect of contract content moderators around the world. As the Online Safety Bill and other legislation are enacted, we are calling on lawmakers everywhere to seize this golden opportunity and set minimum standards for how social media companies moderate harmful content, in a way that respects and upholds their workers’ rights.

In 2019, Mark Zuckerberg’s company turned to Sama, a US-based subcontractor, to be its primary content moderation partner across Africa. Sama primarily supports data services for Fortune 50 companies, but heavily promotes its social mission of lifting marginalised people out of poverty by giving them “dignified” work. But in reality, reporters discovered, Sama was using these workers to sift through thousands of hours of murders, rapes, suicides, child sexual abuse and other graphic content, without reasonable pay, training or emotional support.

These kinds of issues are nothing new to Facebook. Time and again, the company’s content moderation practices have led to diagnoses of PTSD and even attempted suicide. The trauma of content moderators working for Facebook’s contractors is well documented, from Accenture in Phoenix, US, to Cognizant in Hyderabad, India, and Arvato in Berlin, Germany.

“Doing Facebook content moderation while working was both emotionally and mentally devastating,” said Daniel Motaung, a former Facebook content moderator at Sama’s Nairobi office. “I went in ok and went out not ok. It changed the person I was.”

And it’s not just violence and hate speech: democracy is at risk too. In 2020 Facebook committed to providing additional resources for better content moderation of electoral disinformation in Africa. “Supporting elections across Africa continues to be a priority, and we’ve dedicated unprecedented resources both locally and globally, with protecting election integrity at the centre of this work,” said Akua Gyekye, the company’s Public Policy Manager for Africa Elections.

But a study conducted by the South African 2021 Elections Anti-Disinformation Project found that Facebook did not deliver on this promise. The company did next to nothing to protect South Africa’s November 2021 election from the dangers caused by electoral disinformation. Is this another impact of the poor working conditions in its Africa moderation office? We think it is.

The stakes of content moderation failures are high – they have even proven deadly. The conflict in Ethiopia, where Facebook has been accused of allowing hate speech and incitement to spread on its platform, led to thousands of deaths and left more than two million people displaced in 2021.

And so today, after years of inaction, we are calling for the CEOs of Meta and its contractors to be held accountable. Here’s how:

First, lawmakers around the world should strengthen transparency by mandating public audits of social media companies’ content moderation supply chains in legislation such as the Online Safety Bill and the recently introduced Digital Services Oversight and Safety Act (DSOSA) in the US. Regulation intended to improve online safety must do so for content moderators as much as for anyone else.

Second, platforms must absorb the full cost of keeping their services safe. This means ensuring content moderators only review harmful content for a limited period each day, paying moderators properly for their intense work, offering genuine medical and psychiatric care to current and former moderators, and respecting their international labour rights. All other industries are required to make their services safe, even if that comes at a cost. Facebook and the like should be no different.


Third, Facebook should release its unredacted audits of Sama, or explain why none exist. It should also make its full list of content moderator outsourcing partners public, so human rights organisations can properly scrutinise their practices.

Finally, lawmakers across the globe should protect the whistleblowers who came forward by voicing support for them directly and pledging to use the full force of law and public attention to spare them, their livelihoods and their families from retaliation. This was one of the major recommendations of the Joint Committee on the Draft Online Safety Bill, which the government has accepted, committing to amend the Public Interest Disclosure (Prescribed Persons) Order 2014 so that whistleblowers who report online safety breaches to Ofcom are protected.

We must safeguard these brave workers who come forward, and we must ensure their revelations have a positive impact on content moderation practices all over the world. Our very future depends on it.

Damian Collins is the Conservative MP for Folkestone and Hythe; Sean Casten is the Democratic Representative from the 6th district of Illinois; Phumzile van Damme is a former Democratic Alliance MP in the National Assembly of South Africa
