YouTube has blocked comments on videos featuring children which “could be at risk of attracting predatory behaviour” following an outcry over paedophiles’ use of the platform to share content.
The video site said it had disabled comments on tens of millions of videos over the past week.
It comes after a vlogger alleged he had discovered a “wormhole” into a “soft-core paedophile ring” and several companies pulled their adverts from the site.
Matt Watson said he found evidence of paedophiles targeting videos of young girls on the site.
He said their comments often included suggestive remarks and pointed out the parts of videos that might show children in compromising positions.
Some comments included links to other, similar videos which, when clicked, encouraged YouTube’s algorithms to link the clips so they appeared together in a user’s recommended viewing section, Mr Watson said.
In a statement on its “Creator Blog” on Thursday, Google-owned YouTube said that “the important steps we’re sharing today are critical for keeping young people safe”.
The statement said: “Over the past week, we disabled comments from tens of millions of videos that could be subject to predatory behaviour.
“These efforts are focused on videos featuring young minors and we will continue to identify videos at risk over the next few months.
“Over the next few months, we will be broadening this action to suspend comments on videos featuring young minors and videos featuring older minors that could be at risk of attracting predatory behaviour.”
YouTube said a “small number of creators” would be able to retain comment functionality, but would be “required to actively moderate their comments, beyond just using our moderation tools, and demonstrate a low risk of predatory behaviour”.
The site said it had also launched a more effective “comments classifier” which is “more sweeping in scope and will detect and remove two times more individual comments”.
The statement added: “Videos encouraging harmful and dangerous challenges targeting any audience are also clearly against our policies.
“We will continue to take action when creators violate our policies in ways that blatantly harm the broader user and creator community.”
Several major companies, including Disney and Fortnite developer Epic Games, pulled their advertising from YouTube over fears about how videos of children were being used.
YouTube is wrestling with moderating content on its site and has previously faced claims its algorithm has promoted far-right and extreme videos.
It comes after the video sharing app TikTok agreed to pay a $5.7m (£4.2m) fine to settle allegations it illegally collected personal information from children.
The US Federal Trade Commission said the penalty against TikTok, formerly known as Musical.ly, was the largest ever obtained in a children’s privacy case.