iPhone 11: New Apple handset will feature augmented reality cameras, suggest first clear leaks

The company is making AR a central part of its entire strategy

Andrew Griffin
Tuesday 14 November 2017 16:59 GMT
An attendee looks at a new iPhone X during an Apple special event at the Steve Jobs Theatre on the Apple Park campus on September 12, 2017 in Cupertino, California

Apple is working on augmented reality cameras for the back of its next phones, according to new leaks.

The rumours come just days after Apple released the iPhone X, the phone it heralded as the future. And they suggest that branding might be correct, with the rumoured model taking plenty of cues from the newly released phone.

Just as the current phone's front-facing selfie camera can detect the depth of what it's looking at, the new phone will be able to do the same from its back, the new leaks reported by Bloomberg suggest. And in turn that could be used for a wide variety of augmented reality features, of the kind that Apple suggests is the future of the iPhone.

Bloomberg reported that the new features will be coming to Apple's 2019 phone. That means they will presumably arrive in the model after the next one – and given Apple's now complicated naming system for the iPhone, it's not clear what number that phone will be known by, or whether it will have a number at all.

Apple's front-facing TrueDepth sensor is one of the biggest features of the new iPhone X. It's used for a range of features, including the facial recognition technology that unlocks the phone and the animojis that are becoming one of its most popular features.

But it won't be possible to simply add that technology into the back as well as the front of the new phone. On the iPhone X, the selfie camera works by sending out infrared light and watching for how it bounces off the face – but that wouldn't work over the longer distances and more complicated objects that the back camera would need to see.

Instead, the 2019 phone will use a laser that bounces off objects, and the phone will measure how long that round trip takes so that it can map out the entire room.

Apple already allows people to map out their rooms for augmented reality on their phones. But at the moment that is mostly done with software, by pointing the camera around the room and allowing it to recognise the objects and surfaces it needs to see.

Apple and its boss Tim Cook have repeatedly stressed that they see augmented reality as the future not just of the iPhone but of computing in general. Mr Cook has compared the technology to the original Apple Store, and suggested that its effects will spread across the entirety of the Apple line and change everything people do with its phones and computers.

By rolling out those features now, it is allowing developers to create augmented reality experiences and letting customers get used to them. Eventually, it will be able to use the same technologies for more advanced capabilities – including the potential development of AR glasses that would let people see virtual objects without using their phone, Tim Cook has told The Independent.
