Children as young as five years old are being targeted for grooming on Instagram, where attempts have more than tripled, the NSPCC has warned.
More than 5,100 online grooming crimes were recorded by police in just 18 months after a new offence of sexual communication with a child came into force, figures show.
In cases where officers recorded how victims were contacted, Facebook, Snapchat and Instagram were used 70 per cent of the time, according to the data obtained by the NSPCC, with Instagram accounting for 33 per cent.
The charity’s chief executive, Peter Wanless, accused social media firms of “10 years of failed self-regulation”.
“These figures are overwhelming evidence that keeping children safe cannot be left to social networks,” he said.
He added: “It is hugely concerning to see the sharp spike in grooming offences on Instagram, and it is vital that the platform designs basic protection more carefully into the service it offers young people.”
The data runs from April 2017, when the law was changed, to September 2018 and was obtained through freedom of information requests to 39 of the 43 police forces in England and Wales.
In most instances, police forces did not record which particular website or app was used to groom the victim. But where they did, a steep increase in the use of Instagram was observed.
In the first six months since the law came into force, from April to September 2017, there were 126 recorded instances of Instagram being used to sexually groom a child.
Just one year later during the same time period, that number rose to 428, a 240-per-cent increase.
The Independent has contacted Instagram for comment.
According to the NSPCC data, the most common targets of online groomers were girls aged 12 to 15.
One in five victims, however, were aged under 11. Children as young as five were recorded as victims in some instances.
The government is due to publish a white paper on internet safety before the end of winter and Mr Wanless said it was vital it included tough new regulation.
The NSPCC is campaigning for tech firms to be given a legal duty of care to children who use their platforms and for large fines to be imposed on them when they fail to protect under-18s.
One mother of a 13-year-old girl who was groomed by a 24-year-old man over Facebook and Snapchat said that, if social media had not existed, her daughter would have been spared her ordeal.
“We felt as though we had failed as parents – we knew about these social media sites, we thought we were doing everything we could to ensure our children’s safety when they were online, but we still couldn’t protect her.
“Somebody has got to take responsibility for what happens to children on their platforms. Simply put, if social media didn’t exist, this would never have happened to her.”
The white paper on internet safety was originally meant to have been published by the end of 2018, although that deadline later slipped to the end of the winter.
In February, a spokesperson for the Department for Digital, Culture, Media and Sport said it had “heard” demands for an internet regulator and statutory duty of care and was “seriously considering all options”.
A National Crime Agency spokesperson said: “It is vital that online platforms used by children and young people have in place robust mechanisms and processes to prevent, identify and report sexual exploitation and abuse, including online grooming.
“Children and young people also need easy access to mechanisms allowing them to alert platforms to potential offending.
“The National Crime Agency helps industry to enhance their reporting tools and where possible, shares knowledge and expertise to support industry to improve standards and security online.”