Facebook removes thousands of pages linked to QAnon conspiracy theory

Nearly 800 groups were removed as the social media giant takes action against dangerous organisations

Adam Smith
Thursday 20 August 2020 09:52

Facebook has removed thousands of accounts, groups, pages, and advertisements related to the QAnon conspiracy theory.

The social media giant said it took down over 790 Groups, 100 Pages and 1,500 ads tied to QAnon.

Additional restrictions were placed on over 1,950 Groups and 440 Pages on Facebook, as well as over 10,000 Instagram accounts.

“Those Pages, Groups and Instagram accounts that have been restricted are still subject to removal as our team continues to review their content against our updated policy, as will others we identify subsequently,” the company said.

QAnon is a conspiracy theory whose adherents believe that an anonymous government insider called “Q” is posting secret codes on the 4chan message board /pol/ about an upcoming “storm” – a great change they believe will happen under the Trump presidency.

They also believe that a group of powerful, nefarious US politicians and organisations, including the Freemasons and the Illuminati, is seeking to bring down the Trump administration.

Security researcher Mark Burnett, examining QAnon “codes”, has said that they are “not actual codes, just random typing”, based on an analysis of the letters and numbers used in the communications.

President Trump has endorsed Republican QAnon conspiracy theorist Marjorie Taylor Greene, after she won her Georgia congressional primary.

Facebook has previously taken action against QAnon accounts, banning the “Official Q / QAnon” Facebook group, which had close to 200,000 likes at its peak and millions of engagements with its posts when it was removed.

Facebook also removed 980 groups, 520 Pages and 160 ads tied to other militia organisations, some of which have called for violence and riots.

Facebook’s decision to restrict these groups comes as it expands its moderation policy to tackle groups which “demonstrated significant risks to public safety but do not meet the rigorous criteria to be designated as a dangerous organisation.”

The company says that it will allow people to post content that supports these movements and groups – as long as it does not break Facebook’s content policies – but will restrict their ability to use Facebook or Facebook-owned properties to organise.

The move comes one month after Twitter took similar restrictive action against the movement, shutting down thousands of associated accounts in “enforcement action on behaviour that has the potential to lead to offline harm”.

Twitter said the accounts had been “engaged in violations of our multi-account policy [and] coordinating abuse around individual victims”, as well as actively trying to avoid suspensions.

This is not the only action Facebook has taken against potentially violent movements. It also banned the extremist Boogaloo network from its platforms, but an investigation found it had been profiting off advertising from that group for months prior.
