Social media companies could be forced by law to remove illegal content and sign a code of conduct protecting vulnerable users.
The proposed crackdown will be announced by digital minister Margot James, the Daily Mail reported. Her speech comes after the death of Molly Russell, 14, whose family found she had viewed content on social media linked to anxiety, depression, self-harm and suicide before taking her own life in November 2017.
Ms James will make a speech at a conference for Safer Internet Day, saying: "The tragic death of Molly Russell is the latest consequence of a social media world that behaves as if it is above the law. There is far too much bullying, abuse, misinformation as well as serious and organised crime online. For too long the response from many of the large platforms has fallen short.
"We are working towards the publication of the final policy paper, and consultation, before bringing in a new regulatory regime. We will introduce laws that force social media platforms to remove illegal content, and to prioritise the protection of users beyond their commercial interests."
A Department for Digital, Culture, Media and Sport spokesman said: "We have heard calls for an internet regulator and to place a statutory duty of care on platforms, and are seriously considering all options. Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people. Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not."
Jackie Doyle-Price, the minister for suicide prevention, is also expected to call on social media companies to take action to protect users from the impact of harmful content at a conference in London.
Ms Doyle-Price will say: "We must look at the impact of harmful suicide and self-harm content online... in normalising it, it has an effect akin to grooming. We have embraced the liberal nature of social media platforms, but we need to protect ourselves and our children from the harm which can be caused by both content and behaviour."
Ms Doyle-Price is expected to tell the National Suicide Prevention Alliance Conference that internet and social media providers must "step up to their responsibilities to protect their users", while the Government considers tougher regulation.
The minister's comments come ahead of a meeting with Facebook to discuss what action the company is taking to curb harmful online content.
Ms Doyle-Price will echo Health Secretary Matt Hancock's recent warning that the Government is prepared to introduce new legislation "where needed" to tackle the issue.
"If companies cannot behave responsibly and protect their users, we will legislate," she will say. "Providers ought to want to do this. They shouldn't wait for Government to tell them what to do. It says a lot about the values of companies if they do not take action voluntarily."
Ms Doyle-Price will highlight the "danger" created by smartphones that have "revolutionised our world".
"We are becoming addicted to our screens," she will say. "We look around restaurants and see people staring at their screens rather than engaging with each other. We need to learn much better behaviours. Unless we do, we are at risk of becoming over-stimulated. We can lose perspective and become intensely over-engaged with our WhatsApp or Facebook groups. This brings real risk."
In January, Mr Hancock called on internet giants to "purge" the internet of content that promotes self-harm and suicide, following the death of teenager Molly Russell.
Her father Ian Russell said he had "no doubt Instagram helped kill my daughter".
Mr Hancock is due to meet Instagram officials to understand how it is tackling harmful online content.
Instagram boss Adam Mosseri said he was "deeply moved" by Molly's story and that the social media platform is "not yet where we need to be" on the issues of suicide and self-harm.
Mr Mosseri said the Facebook-owned platform had launched a comprehensive review into its policies and was adding "sensitivity screens" to images of self-harm as part of plans to make posts on the subject harder to find.
The Department for Digital, Culture, Media and Sport (DCMS) and the Home Office are due to publish a white paper on the Government's approach to online safety in the winter.
Culture Secretary Jeremy Wright has said the Government is "considering very carefully" calls to subject companies to a legal duty of care.
NHS England chief executive Simon Stevens has proposed the introduction of a mental health levy on social media firms.
Reports by the Commons Science and Technology Committee and the Children's Commissioner for England have called on social media firms to take more responsibility for the content on their platforms.