Facebook finds 111 accounts responsible for majority of anti-vaccine content

Internal report shows researchers’ worries over ‘substantial’ harm caused by a so-called ‘grey zone’ of vaccine misinformation

Gino Spocchia
Wednesday 17 March 2021 12:30
Facebook has found 111 accounts responsible for a majority of the anti-vaccine content on the social media site.

The finding follows an internal study of vaccine misinformation shared by Facebook users, which was seen by the Washington Post.

While false statements surrounding vaccines are already banned, thousands of pieces of content were described as being in a “grey area” for algorithms and moderators.

The 111 accounts responsible for anti-vaccine content were not named in the report, according to the Post, and were found by placing the firm’s US users into 638 “population segments” or groups.

Facebook did not identify which users were in the groups, or the reasoning behind their categorisation, although each segment was said to contain around 3 million users.

Of those segments, as few as 10 groups were found to contain 50 percent of all anti-vaccine content on the platform — otherwise categorised as posts with “vaccine hesitancy”, or “VH”.

And in the segment with the most anti-vaccine content, only 111 users were found to be responsible for half of all vaccine-hesitant content across Facebook.

It was not clear if Facebook took action against the 111 accounts identified for sharing “vaccine hesitancy”, despite the firm finding a connection to the QAnon conspiracy.

According to the Post, Facebook researchers wrote that “It’s possible QAnon is causally connected to vaccine hesitancy,” noting that mistrust of authority was also prevalent in the QAnon online community.

Researchers were also reportedly worried that “VH” content, although not flagged as rule-breaking by Facebook’s algorithms, could be harmful for certain groups of people.

“While research is very early, we’re concerned that harm from non-violating content may be substantial,” Facebook’s study said.

Recently under fire for allowing false claims about coronavirus to be shared on its site, Facebook was last year found to have played a part in the spread of a misinformation-filled documentary called “Plandemic”.

Facebook says it has removed both vaccine disinformation and content connected to the QAnon conspiracy in recent months, following the approval of a number of vaccines in the UK and US and the storming of the US Capitol.

In a statement, a spokesperson for Facebook told The Independent that “Since the start of the pandemic, we have partnered with more than 60 global health experts and have studied content related to Covid-19 including vaccines and misinformation, to inform our policies.”

“We routinely study things like voting, bias, hate speech, nudity, and Covid — to understand emerging trends so we can build, refine, and measure our products.”

The spokesperson added that research could also “help to inform our efforts” in removing false claims shared on the site.
