
Misinformation, violence and social media: Five things we learned from the Facebook files

Company faces SEC complaint alleging it misled investors about its harms

John Bowden
Tuesday 26 October 2021 04:11 BST

Facebook is back under the gaze of international scrutiny following the release of hundreds of documents provided to news outlets by Frances Haugen, a former data scientist with the company.

A series of news reports has brought criticism cascading down on the company from all sides, over issues ranging from Covid-19 misinformation to politically motivated violence and criminal gang activity. Facebook’s critics say the company does little to mitigate the problems that arise on its platforms, and works harder to bury its own employees’ concerns about those same issues.

The files, originally obtained by Ms Haugen while still at the company, are part of her complaint to the Securities and Exchange Commission (SEC) alleging that the company misled investors about the negative effects of its platforms on individuals and society.

Ms Haugen’s initial release of documents came in the form of stories published by The Wall Street Journal; the explosive claims about Facebook putting profit over efforts to fight viral misinformation and divisive content led to a hearing on Capitol Hill, where lawmakers urged CEO Mark Zuckerberg to testify in response.

In the last few days, a consortium of other news outlets has released its own stories about the documents. According to The New York Times, Ms Haugen invited the larger group of media companies, without telling the Journal ahead of time, to access the documents and pick up stories the Journal had missed in its initial run.

Here’s a brief summary of some of the most important findings from the leaked Facebook files:

Facebook relaxed after the 2020 election was over, contributing to the proliferation of misinformation

After the November election came and went last year, executives at Facebook relaxed, apparently assuming the worst of the site’s problems with US political content had passed.

Documents reviewed by numerous media organisations show that the company put efforts to fight false statements by politicians and others on the backburner, including by shutting down its civic integrity team, a move that was reported at the time. As a result, the company’s employees struggled to respond to the growing “Stop the Steal” movement and its associated groups on Facebook, allowing the conspiracy theories to spread in the run-up to the Capitol riot on 6 January.

The company has argued that the team’s responsibilities were absorbed by others at the company; however, Ms Haugen, who was on the team herself, has said that she and her colleagues felt betrayed by the move, leading some to leave Facebook.

The company reinstated a number of safeguards in the hours leading up to and on 6 January itself, but by then it was too late.

The company continues to struggle with content in areas where its moderators do not have experience with local languages

Facebook remains widely popular in countries around the world, which has raised a number of content-moderation challenges as the company pursues new users. According to internal documents, the company faces shortages of moderators in numerous languages, and some of those it does employ lack the local knowledge needed to moderate content effectively.

The result? In some languages, common words and phrases, including the names of major religious landmarks, have been temporarily or permanently banned, while hate speech and misinformation have flourished in other communities where moderation is absent entirely.

The company continues to face problems it acknowledged publicly in 2018, when it admitted it had not done enough to stop content encouraging violence against Rohingya Muslims in Myanmar. Similar issues are now playing out in Ethiopia, according to the documents, where the country’s Tigray minority faces violence and persecution.

Facebook directly addressed this criticism in a statement to CNN, saying it was working to staff up in Ethiopia and other countries judged to be hotspots for political violence: "Over the past two years we have actively invested to add more staff with local expertise, operational resources and additional review capacity to expand the number of local languages we support to include Amharic, Oromo, Somali and Tigrinya. We have worked to improve our proactive detection so that we can remove more harmful content at scale. We have also partnered extensively with international and local experts to better understand and mitigate the biggest risks on the platform.”

The company’s platforms continue to be used by organised crime, even violent drug cartels

Across the world, violent criminal organisations continue to use Facebook’s platforms to organise and do business. Recent reports from the Journal and CNN have detailed the ongoing use of Instagram and Facebook by the Jalisco New Generation Cartel, headquartered in Mexico, as it has posted videos of executions and recruitment messages on the platforms.

Human trafficking is another persistent issue on the platform, tied in some cases to major criminal groups and in others to smaller-scale criminal activity. Internal documents first reported by CNN reveal that the company has known about significant amounts of human trafficking on Instagram and Facebook since at least 2018, and has struggled to deal with the problem effectively. One internal report claimed that the company’s platforms enable “all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks”.

The network was also able to track down active advertisements for human trafficking victims on the platform as recently as last week, despite the company’s efforts to clamp down on such content.

"We prohibit human exploitation in no uncertain terms," said a company spokesperson. "We’ve been combatting human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform."

Facebook pursued products aimed at attracting young teens to its platforms, despite knowing risks

Facebook’s subsidiary Instagram announced, and later paused, a flagship social media product for younger users titled Instagram Kids earlier this year. The company’s reasoning for halting the project was vague, but it essentially blamed an onslaught of criticism.

At the time, the company was already facing questions from lawmakers on Capitol Hill over the issue; those questions are only expected to grow, as documents revealed by Ms Haugen show that the company sat for years on research documenting the harms done to children who use its platforms.

One internal data point concerned body image issues among teenage girls: a presentation slide reportedly showed that nearly a third of teen girls who experienced such negative feelings said that “Instagram made them feel worse” about their bodies.

Despite this, the company ploughed ahead with plans to expand its reach among younger users; in response to criticism, it has argued that teens simply lie about their age to get around the restrictions already in place.

Across nearly every issue, management clashed with rank-and-file workers

Whether the issue was Covid-19 misinformation, lies about the November 2020 US election, or political violence around the world, Facebook appeared internally divided, with management frequently ignoring calls from its own employees to do more to resolve problems on the platforms.

One employee unloaded their criticism of management in the hours after the 6 January attack, writing in an internal post: "How are we expected to ignore when leadership overrides research-based policy decisions to better serve people like the groups inciting violence today?

"Rank and file workers have done their part to identify changes to improve our platform but have been actively held back,” the employee’s damning post continued.

Another employee summed it up in a separate internal post, this one obtained by WIRED: “I think Facebook is probably having a net negative influence on politics in Western countries,” they wrote, adding: “I don’t think that leadership is involved in a good-faith effort to fix this.”
