Google's deep learning capabilities heading to Android, creating phones that can think like people

The company has announced a tie-up with a chip maker whose technology lets computers see in the same way humans can

Your next Android phone might be able to see like a real human being.

Google has announced that it is to integrate deep learning into its phone operating system, allowing phones to use algorithms to recognise what is in pictures and think about it like a person.

The company has begun a tie-up with Movidius, a company that makes chips that help with “machine vision”. The two companies have already worked together on Google’s Project Tango, which uses a series of cameras to allow computers to be able to see spaces in 3D.

Now, similar technology could be on its way to Android phones.

The new features could allow the phones to tell what they are looking at without relying on the internet, according to reports. That might also mean the phone could read street signs or other text through its camera and process that information on the device itself.

It might even enable entirely new kinds of devices and phones, of a sort not seen before, according to Google.

"Our collaboration with Movidius is enabling new categories of products to be built that people haven't seen before," the head of Google’s machine intelligence group, Blaise Agüera y Arcas, told Computerworld.

Deep learning uses computers that are structured something like the human brain, allowing them to recognise and understand things in the same way that we do. Google already uses the technology in some of its products such as Google Photos, which allows people to search for specific objects or people in their pictures.
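The idea described above can be sketched in a few lines of code. The example below is a purely illustrative toy, not anything Google or Movidius has published: layers of simple units, loosely inspired by neurons, transform an input (here a made-up four-pixel "image") into scores for a handful of hypothetical categories. The weights are random; a real system such as Google Photos learns them from millions of labelled pictures.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Common non-linearity applied between layers
    return np.maximum(0, x)

def softmax(x):
    # Turn raw scores into probabilities that sum to 1
    e = np.exp(x - x.max())
    return e / e.sum()

image = np.array([0.2, 0.9, 0.4, 0.1])    # hypothetical pixel values
W1 = rng.standard_normal((4, 8))          # input layer -> hidden layer
W2 = rng.standard_normal((8, 3))          # hidden layer -> 3 made-up classes

hidden = relu(image @ W1)                 # first layer of "neurons"
probs = softmax(hidden @ W2)              # probability for each class

print(probs)           # three probabilities summing to 1
print(probs.argmax())  # index of the most likely class
```

Running something like this locally, rather than sending the picture to a server, is the essence of the on-device recognition the Movidius chips are designed to accelerate.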