Google says it will step up its efforts to stem online extremism by putting more resources into identifying YouTube videos that spread hate.
The announcement comes in the wake of a series of terror attacks. Shortly after midnight on Sunday, a van drove into a crowd outside a North London mosque, killing one and injuring eight others in an attack police are treating as terrorism.
“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now,” Google said in a blog post.
Google said it would fight extremism on its platforms in four ways.
First, it will invest in artificial intelligence that can more effectively identify extremist content. The company will also double the number of independent experts it uses to flag inappropriate videos.
Second, for content that does not clearly violate YouTube’s policies but nonetheless might inflame religious or other tensions, Google will show a warning before a video loads. The content will still be hosted but will not receive advertising revenue.
Lastly, the company says it will promote videos that speak out against hate and target anti-radicalisation adverts at those who may be susceptible to joining groups such as Isis.
“Collectively, these changes will make a difference. And we’ll keep working on the problem until we get the balance right,” the company said.
“Extremists and terrorists seek to attack and erode not just our security, but also our values; the very things that make our societies open and free. We must not let them.”
Anti-hate groups and politicians have been critical of Google and social media sites, which they say have not taken strong enough action to stem the spread of radical material online.
A host of big firms, as well as the UK government, halted advertising on YouTube earlier this year after it emerged that their adverts were appearing alongside extremist material, and therefore potentially funding terrorism.
Labour’s Yvette Cooper, chair of the Commons home affairs select committee, welcomed the announcement. “This is a very welcome step forward from Google after the [committee] called on them to take more responsibility for searching for illegal content,” she said.
“The select committee recommended that they should be more proactive in searching for and taking down illegal and extremist content, and to invest more of their profits in moderation.”