
Banning anti-vax groups on Facebook only drives them to Twitter, study suggests

Cross-platform approach is needed for content moderation to work, scientists say

Vishwam Sankaran
Tuesday 23 August 2022 13:27 BST

Banning Facebook groups that post anti-vaccination content led to more of that content appearing on Twitter in the following month, according to a new study that may prompt a reassessment of policy measures aimed at tackling vaccine misinformation.

Scientists warn that anti-vaccine content and health-related misinformation on social media have been amplified by the Covid-19 pandemic.

They say the 150 largest anti-vaccine social media accounts have gained at least 7.8 million followers since 2019, growing by nearly a fifth over that period.

While a growing number of social media platforms are taking an active role in content moderation, scientists, including Tamar Mitts from Columbia University in New York, say little is known about the spillover effects of such online speech regulation across platforms.

In the new study, presented at WebSci ’22, the 14th ACM Web Science Conference, researchers analysed how removing groups that promoted anti-vaccine content from Facebook affected engagement with similar content on Twitter.

As part of the research, scientists followed 160 Facebook groups discussing Covid-19 vaccines and tracked their removal from the platform between April and September 2021.

Researchers also identified users who cited these Facebook groups on Twitter, and then examined their online behaviour over time.

The study found that users citing removed Facebook groups promoted more anti-vaccine content on Twitter in the month following the removals.

One key finding, the researchers say, is that users exposed to the removal of anti-vaccination pages from Facebook became more committed to those ideas in their Twitter activity, posting increasing amounts of anti-vax content.

Based on the research, scientists say a cross-platform approach is needed for content moderation to work.

Compared with Twitter accounts citing Facebook groups that were not removed, users citing the removed groups used 10-33 per cent more anti-vaccine keywords on the microblogging platform, scientists say.
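To illustrate the kind of comparison that figure implies, the minimal Python sketch below contrasts the share of tweets containing tracked anti-vaccine keywords for two cohorts of users. The keyword list, the sample tweets and the cohort names are assumptions made up for illustration; they are not data or code from the study.

```python
# Hypothetical sketch: compare anti-vaccine keyword rates between users who
# cited removed Facebook groups and users who cited groups left in place.
# Keywords, tweets and cohort names are illustrative only.
ANTI_VAX_KEYWORDS = {"plandemic", "vaccineinjury", "bigpharma"}

def keyword_rate(tweets):
    """Share of tweets containing at least one tracked keyword."""
    if not tweets:
        return 0.0
    hits = sum(
        any(kw in tweet.lower() for kw in ANTI_VAX_KEYWORDS)
        for tweet in tweets
    )
    return hits / len(tweets)

# Toy samples of tweets posted in the month after the Facebook removals.
tweets_by_cohort = {
    "cited_removed_groups": [
        "bigpharma is hiding the truth", "nice weather today",
        "another vaccineinjury story", "read this plandemic thread",
    ],
    "cited_retained_groups": [
        "got my booster", "bigpharma again?", "morning run done",
        "vaccines save lives",
    ],
}

removed_rate = keyword_rate(tweets_by_cohort["cited_removed_groups"])
retained_rate = keyword_rate(tweets_by_cohort["cited_retained_groups"])

# Relative difference, analogous in spirit to the 10-33 per cent range reported.
relative_increase = (removed_rate - retained_rate) / retained_rate * 100
print(f"removed cohort: {removed_rate:.0%}, retained cohort: {retained_rate:.0%}")
print(f"relative increase: {relative_increase:.0f} per cent")
```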

“Our results suggest that taking down anti-vaccine content on one platform can result in increased production of similar content on other platforms, raising questions about the overall effectiveness of these measures,” scientists wrote in the study.

Scientists say there is also a need to develop a better model to estimate the likelihood that an idea expressed on one social media platform originated from another.
