
New iPhone to improve facial recognition feature so it can see its owner better, report claims

Latest technology could add other features to the 2019 phone, such as 3D scans of people's faces

Andrew Griffin
Monday 05 November 2018 13:57 GMT
A customer sets up Face ID on his new iPhone X at the Apple Store Union Square on November 3, 2017, in San Francisco, California (ELIJAH NOUVELAGE/AFP/Getty Images)

Apple's next iPhone could bring important updates to its flagship feature, according to a new rumour.

The phone could vastly improve the Face ID facial recognition system that sits in the top of the handset.

New technology will allow the invisible lights used as part of the system to illuminate people's faces far better, allowing the phone to recognise its owner more quickly, according to a report from reliable Apple analyst Ming-Chi Kuo.

Face ID was first introduced in the iPhone X last year. It works by projecting an array of 30,000 invisible infrared dots onto the holder's face; a sensor in the camera system then reads where they fall and checks whether the right person is holding the phone.
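Apple has never published how that matching step works, but the idea can be sketched in very rough terms: the dot pattern is turned into a depth map of the face, which is then compared against the map enrolled by the owner. The toy Python below is purely illustrative, with the threshold, function names and cosine-similarity comparison all our own assumptions rather than Apple's algorithm:

    # Toy illustration of the matching idea described above -- not Apple's
    # algorithm. Assume the dot projector and infrared camera have already
    # produced a depth map; unlocking is then a similarity check against
    # the enrolled owner's map.
    import numpy as np

    MATCH_THRESHOLD = 0.98  # hypothetical similarity cut-off

    def is_owner(captured: np.ndarray, enrolled: np.ndarray) -> bool:
        """Compare two depth maps by cosine similarity."""
        a, b = captured.ravel(), enrolled.ravel()
        similarity = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        return similarity >= MATCH_THRESHOLD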

But when ambient light is especially bright – on a sunny day, for instance – those invisible lights can get thrown off and the facial recognition sensor can have trouble seeing the phone's owner.

The new feature will improve that by better illuminating people's faces. The new sensor and its better lights will "lower the impacts from visible lights of environment in order to improve the Face ID user experience", according to the report, first spotted by MacRumors.

In addition to those new features, the updated version of the Face ID sensor could also allow it to use "time of flight" calculations to make 3D models of the things it is looking at. That technology works by emitting a pulse of light, measuring how long it takes to bounce back, and using that time to work out the distance – similar in principle to radar.
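The arithmetic behind that is straightforward: light travels at roughly 300,000 kilometres per second, and the measured time covers the round trip, so the distance is half the round trip at light speed. A minimal Python sketch of the principle (illustrative only; the function name is our own, not part of any Apple API):

    # Illustrative sketch of the time-of-flight principle -- not Apple's
    # implementation. A sensor emits a light pulse and times the round
    # trip; the one-way distance is half that trip at light speed.
    SPEED_OF_LIGHT = 299_792_458  # metres per second

    def distance_from_round_trip(seconds: float) -> float:
        """Convert a measured round-trip time into a distance in metres."""
        return SPEED_OF_LIGHT * seconds / 2

    # A face about 30 cm from the phone returns the pulse in roughly
    # two nanoseconds:
    print(distance_from_round_trip(2e-9))  # ~0.2998 metres

Repeating that measurement across many points on a scene is what lets a time-of-flight sensor build up a 3D model rather than a single distance reading.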

If Apple were to integrate that, the phone could use the resulting 3D models to power augmented reality features and improve tools such as Animoji.
