Google Lens uses AI to understand the world better than humans can

The company is doubling down on machine-learning

Aatif Sulleyman
Wednesday 17 May 2017 17:56 BST
I/O is Google's annual developer conference (Getty)

Google has announced a new app called Lens at its I/O developer conference.

It's not available yet, but is coming soon to Photos and the Google Assistant.

With Google Lens, you'll be able to point your phone at real-world objects around you and instantly view useful information about them.

Google demonstrated a handful of impressive examples at the show.

In one, Lens identified the name of a flower, which could be handy with summer around the corner.

In another, jaw-dropping demo, a user pointed Lens at a Wi-Fi router, and it automatically grabbed the username and password using optical character recognition.

It can also consider additional information, such as GPS location data, to work out where you are and bring up information for specific branches of restaurants and shops around you, including reviews and opening hours.

Google CEO Sundar Pichai has made a big deal about the company's shift from a "mobile-first" to an "AI-first" approach.

"We are re-thinking all of our products” using machine-learning, he said.
