Twitter says it ‘should not be a wholly nice place’ amid questions about Facebook, free speech and social media

The company said that it is a ‘mirror on society’ and has to reflect important conversations such as #MeToo and Black Lives Matter

Adam Smith@adamndsmith
Tuesday 27 April 2021 17:23

Twitter has said that its website should not be a “wholly positive or nice place” in answers given to a Parliament committee about website moderation.

“Twitter gives everyone the right to speak publicly”, said Katy Minshall, head of UK public policy at Twitter, “and that’s been a huge societal shift over ten, 15 years”.

“As Twitter reflects a mirror on society, that doesn’t mean Twitter should always be a wholly positive or nice place. Twitter at its best has sometimes been the exact opposite of that with conversations of huge importance”, she continued, referencing conversations on the platform about the #MeToo movement or Black Lives Matter.

Ms Minshall gave this answer in response to questioning by Baroness Bull, one of several members of the House of Lords’ Communications and Digital Committee questioning both Twitter and Richard Earley, UK public policy manager at Facebook, about whether civility on social media was antithetical to its business models.

She also referenced Twitter’s test of a prompt that would pop up if a user was attempting to post “harmful” language.

Twitter would display the prompt “when things get heated” and offer the user the option to “revise [their] reply”. While the social media company gave no indication of exactly what language would trigger the prompt, it suggested it would be comparable to posts that have been reported by users.

Baroness Bull then asked whether the social network had conducted research as to whether this was something users wanted.

“That’s a good question”, Ms Minshall said, explaining that the outcome the company was looking for was users being “mindful about what they’re tweeting”. The main question for Twitter, Ms Minshall said, was whether users were incentivised to act in a way Twitter wants to see – but she gave no conclusive answer about whether this was in users’ interests.

Mr Earley’s answers to the question focused more on users, explaining that it was “categorically not the case” that Facebook encouraged or desired harmful content on its platforms and that the company “survey[s] users constantly” to design its apps and websites.

“Users don’t want to see uncivil content, advertisers don’t want to see it, Facebook doesn’t want it”, he said. This, however, is contradicted by an infamous graph shared by CEO Mark Zuckerberg in 2018 where content gathered more engagement as it approached the point at which Facebook’s community standards would require its removal. Facebook had to change its algorithm specifically in order to reduce the spread of such content and invert the graph.

Earley also claimed that, according to Facebook surveys, 50 per cent of people who see hate speech on the platform are less likely to engage with it; he also referenced the advertiser boycott as an example of pressure Facebook feels when it does not remove uncivil content.

The Facebook representative, however, did not mention a leaked recording in which CEO Mark Zuckerberg dismissed the threat of a boycott.

“We’re not gonna change our policies or approach on anything because of a threat to a small percent of our revenue, or to any percent of our revenue,” Mr Zuckerberg said, according to The Information. The accuracy of the transcript was later confirmed by a Facebook spokesperson to The Guardian.

Twitter also clarified decisions around its removal of former president Trump while keeping the account of Ayatollah Khamenei.

“Israel is a malignant cancerous tumor in the West Asian region that has to be removed and eradicated: it is possible and it will happen”, Khamenei tweeted in 2018. “We will support and assist any nation or any group anywhere who opposes and fights the Zionist regime, and we do not hesitate to say this”, he said in 2020.

Ms Minshall said: “Tweets that were sabre-rattling, we would err on the side of leaving those tweets up”, because of their relevancy to the public interest and to give people the opportunity to hold that leader to account.

However, two years after the policy was implemented, Ms Minshall said that Twitter is reviewing the approach and conducting a public consultation.

Facebook’s Oversight Board, a moderation committee that will oversee challenging content issues for the social network, is currently undergoing a similar process for Mr Trump’s ban from Facebook and Instagram – although a final decision is not expected for “weeks”.
