Internet giants given one-hour deadline to take down terrorist propaganda

Google and Facebook argue European Commission's new time limit too short to allow for effective investigation

Joe Sommerlad
Friday 02 March 2018 15:53 GMT
A screenshot from an Isis propaganda video posted online (Isis)

Internet giants Google, Facebook and Twitter are facing renewed pressure to tackle the problem of terrorist propaganda online after the European Commission (EC) gave them just one hour to remove offensive content from their pages or face penalties.

The EC's demand comes at a time when the major search and social media companies are being urged to do more to censor inappropriate or illegal material posted by users and hosted on their domains.

According to the new recommendations issued on Thursday, leading web companies must take down terrorist material, posts that incite hatred or violence, child sexual abuse videos and images, sites trading in illegal goods or counterfeit products, and instances of copyright infringement within 60 minutes of their being uploaded.

"Considering that terrorist content is most harmful in the first hours of its appearance online, all companies should remove such content within one hour from its referral as a general rule," the EC said in a statement.

The commission will also ask companies to report back on the degree of co-operation they receive from other organisations in order to determine whether stricter legislation is necessary.

Most online media companies have clear rules in place warning users against publishing hate speech and routinely investigate and remove troubling content as soon as it is reported by users.

Isis propaganda intended to appeal to Western gamers

However, the major players had previously signed up to a 24-hour timeframe for deleting objectionable content and argue that the new proposal leaves too little time to act.

"Such a tight time limit does not take due account of all actual constraints linked to content removal and will strongly incentivise hosting services providers to simply take down all reported content,” the Computer & Communications Industry Association warned in response.

Facebook has recently changed the way topical content is shown in users' feeds to counter the problem of "fake news", while YouTube announced in December that it was hiring 10,000 new moderators to more proactively police clips running on the site.
