“I’m excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list,” Vishal Shah, Instagram’s vice president of product, wrote on an internal employee message board, according to BuzzFeed News.
“We will be building a new youth pillar within the Community Product Group to focus on two things: (a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time.”
The new app would be overseen by Adam Mosseri, the head of Instagram, and Pavni Diwanji, who was previously the head of YouTube Kids, Google’s child-friendly version of YouTube.
Mosseri told BuzzFeed that the company knows “more and more kids” want to use Instagram but that age verification is a challenge for the company. Instagram apparently does not have a “detailed plan” for the development of its child-focused Instagram project.
“We have to do a lot here,” he said, “but part of the solution is to create a version of Instagram for young people or kids where parents have transparency or control. It’s one of the things we’re exploring.”
The news comes after Instagram announced new tools to stop adults and children interacting on the platform, such as using artificial intelligence to flag ‘suspicious’ behaviour on the app and encouraging teenagers to use private accounts.
Instagram has historically struggled to stop sexual abusers on its platform, and paedophiles have increasingly used the social media app to target children, the NSPCC said in 2019.
A recent report from cloud storage company pCloud also called Instagram the ‘most invasive app’, reportedly collecting 79 per cent of its users’ personal data to share with third parties, including search history, location, contacts and financial info.
However, the ‘kids’ versions of social media apps have repeatedly failed to protect children. YouTube Kids had to disable comments on the platform to stop paedophiles from using them, while self-harm clips evaded the company’s filters and were intercut into children’s content.
In 2019, a bug in Facebook’s Messenger Kids platform meant children could join groups with strangers, despite the company’s apparently strong privacy controls.
Facebook claimed that it consulted experts in developing the app; Wired, however, reported that the company had a financial relationship with many of the people who had advised it.
“Increasingly kids are asking their parents if they can join apps that help them keep up with their friends. Right now there aren’t many options for parents, so we’re working on building additional products that are suitable for kids, managed by parents. We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more,” a Facebook company spokesperson told The Independent in a statement, but did not verify the accuracy of BuzzFeed’s reporting.
Following the development of Messenger Kids, over 95 children’s health groups sent a letter to Mark Zuckerberg asking him to discontinue the product.
“Excessive use of digital devices and social media is harmful to children and teens, making it very likely this new app will undermine children’s healthy development,” the group said.