
What has Facebook done to fix election interference since 2016 – and will it work?

Measures introduced by CEO Mark Zuckerberg have attempted to rid the platform of malicious actors

Adam Smith
Thursday 21 May 2020 12:18 BST

In his first UK broadcast interview for five years, Facebook CEO Mark Zuckerberg said that the company has “learnt a lot about how politics works online” since 2016.

The company was infamously criticised for its use as a platform by Russian-linked agents, whose Facebook posts reached 126 million Americans in an attempt to swing the election for now-President Donald Trump. Chief among those linked was the Kremlin-directed Internet Research Agency (IRA), which, between 2015 and 2017, flooded social media with “false reports, conspiracy theories, and trolls.”

“One big area that we were behind on in 2016 but I think now are quite advanced at is identifying and fighting these co-ordinated information campaigns that come from different state actors around the world,” Zuckerberg said. “I feel pretty confident that we are going to be able to protect the integrity of the upcoming election.”

Since 2016, the company has taken numerous steps to try to quell the influence of bad actors on its platforms – which also include Instagram and WhatsApp.

This has included creating an archive for political adverts with information about ad impressions and spend, as well as demographic data such as age, gender and location. There is also now a verification process in place for advertisers, while “millions” of fake and suspicious accounts have been banned.

In 2019, the company said it would take action against content that “does not directly violate our Community Standards, but still undermines the authenticity of the platform” by reducing its distribution in the main News Feed and adding in contextual information next to the post so people can better access details about the publisher. These actions continued into 2020 ahead of the upcoming US election.


However, while it remains to be seen whether the company is prepared for the upcoming election, evidence suggests that some problems remain. During the UK general election in 2019, the company refused to remove doctored videos posted by the Conservatives targeting Keir Starmer, then Labour’s Brexit spokesperson and now the party’s leader.

Rebecca Stimson, Facebook’s UK head of public policy, said at the time that “we don’t believe a private company like Facebook should censor politicians. This is why we don’t send content or ads from politicians and political parties to our third-party fact checking partners”.

Even when the company said it would partner with fact-checkers, it maintained that it would not remove misinformation – particularly troubling given that one study found 88 per cent of Conservative adverts to be misleading, compared with seven per cent of Labour’s.

When Conservative Party adverts that used clips from BBC news reports out of context were removed, it was on the grounds of intellectual property violations rather than because the governing party was proactively spreading disinformation.

Writing in 2019, Facebook’s former head of global elections integrity operations, Yaël Eisenstat, made scathing criticisms of the social media giant, stating that “the real problem is that Facebook profits partly by amplifying lies and selling dangerous targeting tools that allow political operatives to engage in a new level of information warfare.”

She added: “It’s clear that the company won’t make the necessary fixes without being forced to, either by advertisers who refuse to spend money on their platforms until Facebook cleans up the spread of misinformation and other harmful content; employees who continue to demand accountability and responsibility from their leaders; and most immediately, government action.”

An ex-employee of Cambridge Analytica – the firm which scraped user data in breach of Facebook’s terms and conditions to tilt the election in the Republican candidate’s favour – said that Facebook has not learned its lessons since the 2016 election.

“Of the billions of dollars that are being spent on the American elections, most of the political advertising money will flow to Facebook and – based on what I saw at Cambridge Analytica – none of it, as far as I am aware, will be fact-checked. No content will be blocked or removed, even if it’s found to be demonstrably false,” Cambridge Analytica whistleblower Brittany Kaiser wrote.

“Fake news ads from the Trump campaign about his political rival Joe Biden were blocked by CNN, but have been hosted on Facebook for months racking up millions of views by impressionable voters.”

Facebook's work since the last US election has attempted to walk the fine line between protecting freedom of speech and keeping its platform free from misinformation and disinformation, but it remains unclear whether that work has been enough to protect democracy.
