In the lawsuit, filed on Monday, the plaintiff, who is referred to as “Jane Doe” throughout the document, claimed that they were required to watch videos of murder, child rape, abortion and suicide during their time working for YouTube.
The plaintiff worked for the site as a content moderator through contracting firm Collabera from 2018 to 2019, and said they experienced nightmares and developed a fear of crowded places due to the content they were forced to watch, according to CNBC.
The suit claims that YouTube’s wellness coaches were unavailable to employees who worked in the evening and, for those on the day shift, were not licensed to give medical advice.
Moderators were also forced to pay for their own medical treatment when they sought professional help to process what they watched during their working hours, the suit alleges.
The plaintiff claimed that a majority of content moderators stay in the position for only a year or less due to the nature of the job, and said that this leaves the company “chronically understaffed.”
This has led to a culture of overtime and caused many moderators to exceed YouTube’s four-hour daily limit on viewing content for review, a limit designed to protect employees, according to CNBC.
The moderators are also expected to have an “error rate” of between two and five per cent on the 100 to 300 pieces of content they review daily, the suit claims.
YouTube, which is owned by Google, also allegedly decides whether moderators view blurred content and how long they must watch it as part of the review process, meaning employees often do not know what they will have to look at.
Joseph Saveri Law Firm, which is representing the plaintiff, filed a similar lawsuit against Facebook earlier this year, which resulted in the social media site agreeing in May to pay out $52m (£40.8m) to moderators.
YouTube recently switched back to human moderators, having relied on AI to review content on its site earlier in the year, when 10,000 employees were unable to work in the office due to lockdowns forced by the coronavirus pandemic.
The AI system was programmed to be cautious, which led to content that was close to breaking YouTube’s rules being removed from the site.
This led to twice as many videos as normal being removed from the site between April and June, even though a significant number of them did not actually violate YouTube’s rules, according to the Search Engine Journal.
More than 11 million videos were removed from YouTube between April and June, and at least 160,000 were reinstated after creators appealed and some human content moderators returned to reviewing content.