Facebook allows far-right group Britain First to set up new pages and buy adverts despite vow to combat extremism

Paul Golding used page to spread conspiracy theory over New Zealand terror attack

Lizzie Dearden
Home Affairs Correspondent
Monday 18 March 2019 16:23 GMT
Paul Golding set up two new pages and paid for Facebook adverts to promote Britain First (PA)

Facebook has allowed far-right group Britain First to set up new pages and pay for adverts, despite vowing to crack down on extremists.

Days after the social media giant was used to livestream the New Zealand terror attack, The Independent can reveal that anti-Muslim leader Paul Golding set up two new pages on the platform.

One was functioning as Britain First’s official page and had more than 7,300 followers, with Golding posting pictures from a “Britain First defenders” training day and telling people to “pray for churches” in response to the Christchurch mosque shooting.

He shared a link to a conspiracy theory claiming the alleged culprit Brenton Tarrant, a white supremacist who deliberately targeted Muslims, was “linked to left-wing groups”.

The second page, called “authentic Paul Golding”, was a personal profile but also named him as the leader of Britain First and linked users to the group’s website.

Fiyaz Mughal OBE, founder of Islamophobia monitoring group Tell Mama, said Golding’s pages showed Facebook was failing to stop extremists opening new accounts.

“It makes an absolute sham of claims that social media platforms are getting to grips with extremism,” he told The Independent.

“For far too long the far-right have been given the benefit of the doubt for ‘free speech’. We’ve been speaking about it for seven years.”

Mr Mughal accused social media firms of taking a “racialised” approach to extremism and letting the far-right slip through the net.

“We’re dealing with lives here and these guys are dealing with profits,” he added. “This is appalling.”

Golding also paid for two adverts that ran on Facebook in recent days.

Seen by up to 100,000 people, one advert urged viewers to follow Golding’s “official Facebook page”, and the other linked to a Britain First petition.

Golding’s two new pages, including one that had been online for more than four months, were deleted after The Independent requested comment from Facebook.

Both of Paul Golding’s Facebook pages listed him as the leader of Britain First and linked to the group’s official website (Facebook)

The internet giant has said it works to remove extremist content of all kinds using human moderators and artificial intelligence. But it has faced renewed calls to increase its efforts following the New Zealand attack.

Ben Wallace, the security minister, told MPs the government would shortly be publishing draft laws against “online harm” that will set out clear expectations for internet firms.

“Social media platforms should be ashamed that they have enabled a terrorist to livestream this evil massacre and spread this mantra of hate to the whole world,” he added, in response to an urgent question over the New Zealand attack.

“Enough is enough. We’ve been clear that tech companies need to act more quickly to remove terrorist content and ultimately prevent new content being made available to users in the first place. This must be a wake-up call for them to do more.”

Britain First’s adverts were permitted despite the deletion of its official Facebook page, and those of Golding and co-leader Jayda Fransen, in March last year.

The pair had just been jailed for religiously aggravated harassment, and pressure to remove Britain First pages mounted after its posts were linked to the Finsbury Park terror attack, the murder of Jo Cox and a car ramming that targeted Muslims.

At the time, a spokesperson for Facebook said the official Britain First page and those of Golding and Fransen had repeatedly broken its community standards.

“There are times though when legitimate political speech crosses the line and becomes hate speech designed to stir up hatred against groups in our society,” a statement said.

“They have repeatedly posted content designed to incite animosity and hatred against minority groups, which disqualifies the pages from our service.”

Britain First styles itself as a “patriotic political party” and is applying to register with the Electoral Commission in Northern Ireland, while its candidates campaign for English council seats.

Founded in Swanley as a splinter group of the British National Party, the group became known for launching so-called Christian patrols and mosque “invasions”, while campaigning against immigration and Islam.

Britain First forged links with extreme nationalist movements across Europe, with Fransen attending a march in Poland where she called Islam a “cancer moving through Europe”, adding: “Our children are being bombed, our children are being groomed and our government does nothing.”

The Finsbury Park terror attacker, Darren Osborne, read Britain First posts while planning his attempted massacre of Muslim worshippers, while neo-Nazi Thomas Mair repeatedly shouted “Britain first” while murdering Labour MP Ms Cox.

The group gained international notoriety after videos posted from Fransen’s Twitter account were retweeted by Donald Trump in 2017, sparking a diplomatic row between Downing Street and the White House.

After being released from prison last year, Golding and Fransen focused their activity on Northern Ireland and have been charged with several other offences relating to protests.

In January, Fransen announced she had left Britain First to start an “exciting new chapter”. She retains a personal Facebook page.

A Facebook spokesperson said: “Hate speech of any kind is not allowed on Facebook and Paul Golding’s pages have been removed for violating our policies.

“We invest heavily in specialist teams and new technology to identify, review and take down hate speech and in Q3 2018, took action on 2.9 million pieces of hate speech content – 51.6% of which we found before users reported it to us.

"However, there is always more we can do to disrupt people wanting to spread hate online which is why we also collaborate with policymakers, civil society, and others in the tech industry, to help us improve the effectiveness of our approach.”
