WhatsApp head criticises Apple’s plan to scan photos for child abuse: ‘Setback for people’s privacy’

‘I think this is the wrong approach and a setback for people’s privacy all over the world,’ WhatsApp CEO says

Bevan Hurley
Saturday 07 August 2021 15:24 BST

A bitter feud over privacy between tech giants Facebook and Apple has intensified over the iPhone maker’s plans to launch a new photo-scanning programme to identify child abuse images.

Will Cathcart, the head of Facebook-owned WhatsApp, said Apple’s proposal to introduce software that can scan private photos on its iOS system was a clear privacy violation.

Declaring that WhatsApp would not adopt the tools on its platform, Mr Cathcart said Apple’s approach “introduces something very concerning into the world”.

“I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no,” Mr Cathcart posted in a lengthy Twitter thread on Friday.

“Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone - even photos you haven’t shared with anyone. That’s not privacy.”

Apple announced on Thursday it was launching new features that can detect signs of abuse in photos and messages.

The company is introducing three new measures, covering messages, photos and additional features that will be added to Siri.

The additions are coming “later this year”, Apple said, in updates that will come to all of its platforms: iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.

The features will initially be limited to the US.

If a child receives an image that appears sexually explicit, it will be blurred and the child will receive a warning.

If the child decides to view the image anyway, they will be told that their parents will be alerted, and a parent will then be notified.

Mr Cathcart said the surveillance system could be used to scan private content for anything Apple or a government wanted to monitor.

“Countries where iPhones are sold will have different definitions on what is acceptable,” he added.

Apple has said that other child safety groups are likely to be added as sources as the programme expands.

“Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?” asked Mr Cathcart.

Apple CEO Tim Cook and Facebook founder Mark Zuckerberg have previously clashed over privacy.

In April, Apple announced it was releasing an update for its iPhones and iPads which required app developers to explicitly request permission to track users’ behaviour on the internet.

Facebook, whose business model relies on its ability to sell users’ personal information to advertisers, hit out at the move, saying limiting personalised ads would “take away a vital growth engine for businesses”.

In an interview with the New York Times that month, Mr Cook dismissed Facebook’s concerns.

“I think that you can do digital advertising and make money from digital advertising without tracking people when they don’t know they’re being tracked,” he said.
