
UK could ban social media companies over self-harm content, health secretary says

'Step up and purge this content once and for all'

Zamira Rahim
Sunday 27 January 2019 16:35 GMT
Health secretary Matt Hancock: 'we must legislate' social media companies


Britain could ban social media companies that fail to remove harmful material, health secretary Matt Hancock has said.

The politician wrote to Twitter, Snapchat, Pinterest, Apple, Google and Facebook after the death of Molly Russell, a 14-year-old girl who had been viewing material online linked to depression, self-harm and suicide.

“We can legislate if we need to,” he said, when asked about websites where content promoting self-harm and suicide can be found.

“It would be far better to do it in concert with the social media companies but if we think they need to do things that they are refusing to do then we can and we must legislate.”

The health secretary was asked, during his appearance on the BBC’s Andrew Marr Show, if the UK would go as far as banning or imposing extra taxes on websites that failed to remove harmful content.

“Ultimately parliament does have that sanction, yes,” he said.

“It’s not where I’d like to end up, in terms of banning them, of course, because there’s a great positive to social media too.

“But we run our country through parliament and we will and we must act if we have to.”

Mr Hancock’s intervention came after Molly Russell’s father said that Instagram helped kill his daughter.

The teenager was found dead in November 2017 and an inquest into her death is expected later this year.

In his letter to the social media companies, the health secretary said that he was "desperately concerned" to ensure young people were protected.

“I welcome that you have already taken important steps, and developed some capabilities to remove harmful content," he wrote. "But I know you will agree that more action is urgently needed.

“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.

“It is time for internet and social media providers to step up and purge this content once and for all.”

Mr Hancock said that the government was preparing a white paper examining online dangers, which would consider content on suicide and self-harm.


“I want to work with internet and social media providers to ensure the action is as effective as possible," he said.

"However, let me be clear that we will introduce new legislation where needed.”

The Samaritans praised the health secretary, saying that he had taken a positive step in contacting the technology companies and talking about how social media platforms could do more to protect users from harmful content.

“While there are lots of positive peer support communities on social channels, we need to maximise opportunities to see positive content and minimise potential for seeing more harmful content," a spokesperson for the charity said.

“Tech companies could do more to show users how to flag dangerous content and should work together to remove dangerous imagery across their platforms. There is also a need for more research into this issue, which should be carried out by these companies."

However, the spokesperson said the charity knew that for a lot of people, social media and the online environment more generally provided them with an important space to share their feelings and get support.

"It is really important that this continues, and we’d like companies to do more to ensure that people searching for self-harm and suicide content are able to access supportive content more easily,” they said.

It is understood that Instagram is taking steps to reduce the amount of harmful content on the platform and will inform Mr Hancock of its plans.

The site’s approach will include blocking content from appearing on hashtag searches, where the hashtag is being used to share significant amounts of harmful material.

“Our thoughts go out to Molly’s family and anyone dealing with the issues raised in this report," a spokesperson for Instagram said.

"Nothing is more important to us than the safety of the people in our community, and we work with experts everyday to best understand the ways to keep them safe.

“We do not allow content that promotes or encourages eating disorders, self-harm or suicide and use technology to find and remove it.

"Mental health and self-harm are complex and nuanced issues, and we work with expert groups who advise us on our approach.


"They tell us that the sharing of a person’s mental health struggle or connecting with others who have battled similar issues, can be an important part of recovery.

"This is why we don’t remove certain content and instead offer people looking at, or posting it, support messaging that directs them to groups that can help.

“We are undertaking a full review of our enforcement policies and technologies around self-harm, suicide and eating disorders. As part of this, we are consulting further with mental health bodies and academics to understand what more we can do to protect and support our community, especially young people.

“While we undertake this review, we are taking measures aimed at preventing people from finding self-harm related content through search and hashtags.”

For confidential support in the UK, contact the Samaritans on 116 123. In the US, contact the Suicide Prevention Lifeline on 1-800-273-8255.

Additional reporting by agencies
