Apple indefinitely delays introduction of photo scanning features after widespread outcry

Andrew Griffin
Friday 03 September 2021 21:19 BST

Apple has indefinitely delayed the introduction of its new anti-child abuse features, following widespread outcry from privacy and security campaigners.

The company had said that the two new tools – which attempt to detect when children are being sent inappropriate photos, and when people have child sexual abuse material on their devices – were necessary as a way to stop the grooming and exploitation of children.

But campaigners argued that they increased the privacy risks for other users of the phone. Critics said that the tools could be used to scan for other kinds of material, and that they undermined Apple’s public commitment to privacy as a human right.

Now Apple has said that it will indefinitely delay those features, with a view to improving them before they are released.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” Apple said.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple had never given any specific indication of when the new features would be introduced. While it said that they would arrive with a version of iOS 15 – expected to be pushed out to iPhone and iPad users this month – it suggested that they might be introduced at some point after that initial launch.

Likewise, it gave no indication of how long the new consultation process would take, or whether it expected substantial changes to the system before it is released.

Apple announced the changes – which are made up of three new features – in early August. It said that it would add new information to Siri and search if people looked for child sexual abuse material (CSAM); that it would use the phone’s artificial intelligence to look at pictures sent to children and warn their parents if they appeared to be receiving inappropriate images; and that it would compare images uploaded to iCloud Photos with a database of known CSAM images, and alert authorities if they were found.

Apple stressed that all of the changes were intended to preserve privacy. It said that the scanning of photos happened entirely on the device, in order to preserve the end-to-end encryption of iMessage, and so that its servers were not involved in actually looking at images as they were uploaded to iCloud.
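As a rough illustration only – not Apple's actual implementation, which relies on its NeuralHash perceptual-hashing system and cryptographic matching protocols – the on-device check amounts to comparing a fingerprint of each photo against a preloaded database of fingerprints of known images. The names and hash value below are hypothetical.

    import hashlib
    from pathlib import Path

    # Hypothetical stand-in for a database of hashes of known abuse imagery,
    # which in Apple's design would be supplied by child-safety organisations.
    KNOWN_IMAGE_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def image_fingerprint(path: Path) -> str:
        # A real system would use a perceptual hash that survives resizing and
        # re-encoding; a plain cryptographic hash of the file bytes is used
        # here purely to keep the sketch simple.
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def matches_known_database(path: Path) -> bool:
        # True if this photo's fingerprint appears in the known-image database.
        return image_fingerprint(path) in KNOWN_IMAGE_HASHES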

The features gained approval from safeguarding groups, including the National Center for Missing and Exploited Children, which worked with Apple and was to provide the database of known abuse imagery against which photos would be checked. The Internet Watch Foundation said it was a “vital step to make sure children are kept safe from predators and those who would exploit them online” and said that Apple’s system was a “promising step” both towards protecting privacy and keeping children safe.

But those assurances were not enough to satisfy security and privacy advocates. Edward Snowden said that Apple was “rolling out mass surveillance to the entire world”, and the Electronic Frontier Foundation said the feature could easily be broadened to search for other kinds of material.

“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses,” it said in a statement shortly after the feature was announced.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine-learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

In the wake of that outcry, Apple’s software chief Craig Federighi admitted that the announcement had been “jumbled pretty badly” but said that he and the company were still committed to the underlying technology.

Apple also sought to give more information about exactly how the feature worked, offering assurances including a commitment to be transparent with security researchers and saying that it would set the threshold for flagging CSAM matches high enough that it did not expect the system to produce false positives.
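Apple's public explanations framed that threshold as a minimum number of matched images before an account would ever be surfaced for human review. A minimal sketch of that idea, with a hypothetical threshold value:

    # Hypothetical threshold: an account is only flagged for review once the
    # number of matched images reaches this count, so that isolated false
    # matches never trigger a report on their own.
    MATCH_THRESHOLD = 30

    def should_flag_account(matched_image_count: int) -> bool:
        # Flag only after enough independent matches have accumulated.
        return matched_image_count >= MATCH_THRESHOLD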

But the opposition continued, and critics continued to call on Apple to drop the feature. In mid-August, a coalition of more than 90 different activist groups wrote an open letter to Apple’s chief executive, Tim Cook, asking him to abandon what it called a plan to “build surveillance capabilities into iPhones, iPads and other Apple products”.

It warned that the feature in iMessages could put young people at risk by flagging images to their parents, noting especially that “LGBTQ+ youths with unsympathetic parents are particularly at risk”.

It also said that once the photo-scanning feature was built, “the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable”.

Apple said it would resist any attempts by governments to broaden the use of the features, and that they were only planned for use in the US initially.
