Social media giants home in on far-right extremism as Facebook bans Britain First

Company says Paul Golding and Jayda Fransen crossed line by disseminating 'hate speech designed to stir up hatred against groups in our society'

Joe Sommerlad
Friday 16 March 2018 01:16

Facebook's decision to remove the accounts of far-right political group Britain First and its leaders is the latest instance of social media companies cracking down on hate speech after initially expressing a reluctance to engage in censorship.

The company took action days after Paul Golding and Jayda Fransen were jailed for anti-Muslim hate crimes.

Facebook stressed that it remains an "open platform for all ideas and political speech" but said: "There are times though when legitimate political speech crosses the line and becomes hate speech designed to stir up hatred against groups in our society."

It added that content posted on the Britain First Facebook page and those of Golding and Fransen had "repeatedly broken our community standards".

It said: "We recently gave the administrators of the pages a written final warning, and they have continued to post content that violates our community standards. As a result, in accordance with our policies, we have now removed the official Britain First Facebook page and the pages of the two leaders with immediate effect."

Pressure on social media sites to address racist, Islamophobic and homophobic posts has increased in recent years, with the election of Donald Trump as US President raising questions about "fake news" and the vulnerability of social news feeds to manipulation and propaganda.

While tech giants like Facebook, Google, Twitter and YouTube were at first disinclined to accept responsibility for content posted by their users, steps are now being taken to tackle the problem.

Facebook in particular has been criticised for taking too long to remove Britain First despite the group's posts being linked to the murder of MP Jo Cox in 2016 and last summer's Finsbury Park terror attack.

Twitter banned Golding and Fransen in December, a month after videos of Islamophobic violence posted by the latter were retweeted by President Trump, sparking a transatlantic row with PM Theresa May.

Britain First's website is still online and it retains a YouTube channel with 60,000 subscribers but no content, but the bans drastically reduce the group's ability to spread its message and communicate with sympathisers.

Twitter last year removed blue verification ticks awarded to a number of prominent neo-Nazis and white supremacists in the US like Richard Spencer, who last month complained the site was quietly "purging" his account of followers to limit his influence.

Twitter CEO Jack Dorsey meanwhile acknowledged earlier this month that bigotry expressed on the site has “real-world negative consequences” and needs to be addressed.

“We have witnessed abuse, harassment, troll armies, manipulation through bots and human-coordination, misinformation campaigns and increasingly divisive echo chambers,” he said. “We aren’t proud of how people have taken advantage of our service, or our inability to address it fast enough.”

London Mayor Sadiq Khan this week publicised the racist and Islamophobic tweets he has received since taking office, hoping to cast new light on the problem.

Khan also reminded leading tech companies that they are "not above the law" and had a responsibility to protect their users from inappropriate or dangerous material.

YouTube faced renewed political pressure to clamp down on inflammatory content this week, with the Parliamentary Home Affairs Committee labelling the site "a platform for extremism" on Wednesday over its failure to take down an offensive video by neo-Nazi organisation National Action.
