Facebook knew its algorithm made people turn against each other but stopped research

Features combating hyperpolarising content would have required "a moral stance", but a report suggests Facebook executives including Mark Zuckerberg dismissed the projects

Adam Smith
Thursday 28 May 2020 12:19 BST
Facebook CEO Mark Zuckerberg at the Elysee presidential palace in Paris on 23 May 2018, following a meeting with the French president on the day of the Tech for Good summit

Facebook executives decided to end research that would have made the social media site less polarising, over fears that the changes would unfairly target right-wing users, according to new reports.

The company also knew that its recommendation algorithm exacerbated divisiveness, leaked internal research from 2016 appears to indicate. Building features to combat that effect would have required the company to sacrifice engagement – and, by extension, profit – according to a later document from 2018, which described the proposals as “antigrowth” and requiring “a moral stance.”

“Our algorithms exploit the human brain’s attraction to divisiveness,” a 2018 presentation warned, adding that if action was not taken Facebook would provide users with “more and more divisive content in an effort to gain user attention & increase time on the platform.”

According to a report from the Wall Street Journal, in 2017 and 2018 Facebook conducted research through newly created “Integrity Teams” to tackle extremist content and a cross-jurisdictional task force dubbed “Common Ground.”

The Common Ground team was intended to tackle polarisation directly, said people familiar with its work, although a 2018 document stated that the team was “explicitly not going to build products that attempt to change people’s beliefs” but would instead aim to “create products that increase empathy, understanding, and humanization of the ‘other side’”.

The project was particularly concerned about private groups, following a 2016 presentation by a Facebook researcher which found extremist content in more than one-third of large German political groups on the site – most of them private, meaning their content was not visible to non-members.

These groups were disproportionately influenced by a small coterie of hyperactive users, and the presentation stated that the majority of people who joined extremist groups were encouraged to do so by Facebook’s algorithm. “Our recommendation systems grow the problem,” the presentation said.

While Facebook was dedicated to neutrality – deciding that the social media site should not police users’ opinions or vilify their political opponents, according to internal documents – it was suggested that Facebook change its algorithm to recommend a wider range of groups, or add a feature to create smaller communities within Facebook Groups where political arguments could take place.

However, Facebook executives including chief executive and founder Mark Zuckerberg “largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products,” the Journal reported.

Facebook's vice president of global public policy, Joel Kaplan, argued that such actions were “paternalistic”; at the time, disapproval from Kaplan’s team was enough to sink a project, said people familiar with its development.

Negative reviews from the policy team reportedly halted work on a classification system for hyperpolarising content. Efforts to suppress political clickbait were also shut down.

Carlos Uribe, who led the News Feed integrity team at the time, pushed a proposal known as “Sparing Sharing,” which would have reduced the reach of content from hyperactive users – who were typically far more partisan than average users and engaged in behaviour similar to that of spammers. Uribe declined to comment to the Wall Street Journal beyond confirming that he supported the Sparing Sharing proposal.

Facebook researchers suggested that the proposal could bolster the platform's defences against the kind of spam and manipulation that had enabled Russian interference during the 2016 presidential election.

However, Kaplan and other Facebook executives pushed back on Uribe’s proposal, saying that limiting the ability of Facebook’s most dedicated users to push content would unfairly penalise them – citing the hypothetical example of a Girl Scout troop that became "super-sharers" by promoting cookie sales.

While Zuckerberg approved a weakened version of the programme, he reportedly said that he was losing interest in the effort to change the platform for the good of its users and asked not to have the subject brought to him again. Smaller changes were made to the platform, including efforts to promote news stories from a broad user base and penalties for publishers who shared fake news.

Uribe has since left the company, reportedly out of frustration with executives, and said that while such proposals would affect conservatives in the US, they would affect different groups in other countries. The Common Ground team was disbanded, and while the Integrity Teams still exist, many senior staffers have either left Facebook entirely or moved to Instagram, which is owned by Facebook.

In response to the report, Facebook said in a blog post that the Wall Street Journal "wilfully ignored critical facts that undermined its narrative". The company pointed to changes to the News Feed, limits on the reach of Pages and Groups that breach Facebook's standards or share fake news, work to combat hate speech and misinformation, and "building a robust Integrity Team".

The news comes as Donald Trump prepares to sign an executive order targeting social media sites after Twitter added a fact-checking label to his tweets claiming that postal voting would result in a "fraudulent" election. The president has claimed that social media sites are biased against right-wing voices, an assertion that is hotly debated.
