
YouTube hiring thousands of staff to stop disturbing videos aimed at children

The company will use technology developed to detect terrorist content to try to keep children safe

Andrew Griffin
Tuesday 05 December 2017 15:18 GMT
YouTube themed cupcakes are displayed during Murray SawChuck's 100,000 YouTube subscriber party at Planet Hollywood Resort (Gabe Ginsberg/Getty Images for Murray SawChuck)

YouTube says it will hire more than 10,000 people in part to address a disturbing trend among its videos.

The new moderators will try to stop the spread of bizarre, potentially damaging videos across its site. In recent weeks, the site has been increasingly criticised for hosting videos that appear to target children but in fact show graphic, extremist and violent content.

The videos often pose as footage from children's TV shows, feature well-known characters, or explicitly claim to be aimed at kids. But when viewers click through, they find content that might not even be suitable for adults – such as Peppa Pig swinging a chainsaw, or children being forced to pretend to be sick.


Many of the videos appear to be generated by bots that pick out heavily searched terms and create new videos – many of which turn out to include extreme violence or other disturbing content. Still others are made by people who have been accused of abusing their children on video for clicks.

The site says it is proud of its success in developing software that can identify extremist videos and those linked to terrorism. Now it will use that same technology to search out other problem videos, like those that target children.

Once it finds those videos, it will stop their uploaders from earning advertising money and may even remove the videos from the service.

The extra moderators will help identify problem videos as they are posted. Their decisions will then be fed into machine-learning algorithms, training the systems to pick out such videos themselves.

The goal is to bring the total number of people across Google working to address content that might violate its policies to over 10,000 in 2018, YouTube CEO Susan Wojcicki said in one of a pair of blog posts published on Monday.

"We need an approach that does a better job determining which channels and videos should be eligible for advertising," she said. "We've heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don't demonetize videos by mistake."

In addition, Wojcicki said the company would take "aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether."

The moves come as advertisers, regulators and advocacy groups express ongoing concern over whether YouTube's policing of its service is sufficient.

YouTube is reviewing its advertising offerings as part of its response, and hinted that its next step could be further changes to the requirements for sharing in ad revenue.

YouTube this year updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions.

Additional reporting by Reuters
