
YouTube accused of ‘providing platform for extremism’ as white supremacist video resurfaces

Home Affairs Committee says 'search engines are promoting things that further and further radicalise people'

Harry Cockburn
Tuesday 13 March 2018 22:58 GMT
The video showed National Action members marching through Darlington and performing Nazi salutes in November 2016 (YouTube)

YouTube was accused of providing a "platform for extremism" by an MP after four versions of a video from the neo-Nazi National Action group were found on the site more than a year after being flagged up by MPs.

William McCants, global counter-terrorism lead for both Google and YouTube, apologised before the Commons Home Affairs Committee.

He said he would make it his “personal mission” to ensure YouTube was not a platform for extremist views and was rid of videos that promoted hate or violence.

It followed a 45-minute grilling by the committee, during which it emerged that both YouTube and Google had been repeatedly asked to remove the propaganda video, which relates to a white supremacist speech delivered at a National Action demonstration in Darlington in November 2016.

Committee chairwoman Yvette Cooper said it was “utterly disgraceful” that the content from the proscribed group’s supporters was still online despite it being flagged at least seven times.

She added: “The fact is you are continuing to host illegal organisations, you are continuing to collude with these illegal organisations by providing a platform for their extremism.”

Mr McCants said he felt “great personal frustration” over the four videos and insisted YouTube was not a place where material from proscribed organisations could be promoted.

He said the technology system YouTube uses to identify potentially questionable content had worked, but that there had been failings by its human reviewers, who had made "the wrong call".

Part of the problem was that reviewers, who undergo “weeks-long training”, were more familiar with Islamic content and that far-right material was more difficult to recognise, he said.

Three changes have been put in place since last week as a direct response, he said.

National Action videos will now be sent to specialist reviewers, general reviewers will get extra training, and the technology to identify problematic material will be fine-tuned.

But Mr McCants could not say where the individuals were based, whether they were YouTube employees or contracted out, or whether they had previously made incorrect decisions.

Ms Cooper said it was “frankly shocking you seem to know so little about who they are”, adding that the committee was “extremely disappointed” at his evidence.

She said: “Do not believe this is the first time you have heard this – allegations and concerns that your algorithms are promoting more and more extreme content at people.

“Whatever they search for, what they get back is a whole load more extreme recommendations coming through the algorithms. You are the king of the search engine and yet your search engines are promoting things that further and further radicalise people.”

In a subsequent tweet she said YouTube’s evidence to the committee was “shockingly weak”.

Press Association contributed to this report
