
Google's self-driving cars are now tackling street driving, but are they too wimpy?


James Vincent
Tuesday 29 April 2014 13:13 BST

Google has released the latest update to its self-driving car project, showing how the vehicles have been busy learning to navigate not just motorways, but chaotic city streets.

The video below shows the company’s driverless car navigating a range of different hazards, including construction works, blocked lanes and railway crossings.

“As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer,” wrote Chris Urmson, the director of the project, in a blog post on Monday.

“As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it).”

The technology giant says that its vehicles have now logged nearly 700,000 autonomous miles since 2009, with the only two recorded accidents directly caused by humans (in one, a driverless car was rear-ended at a stop light; in the other, a human had taken control of the vehicle).

However, the latest video by Google highlights another potential problem: what if self-driving cars are too timid?

In the city driving demonstration, Google’s vehicle is cautious to a fault, in one scenario detecting when a cyclist makes a hand signal to change lanes (a big step forward) and continuing to yield “even when [the cyclist] changes his mind multiple times”.

Now, obviously a human driver would do the same – not knowing whether the cyclist was having some sort of trouble or just being an idiot – but it does suggest that self-driving cars might lack some of the intuition necessary for realistic driving.

Being extremely careful is certainly a point in self-driving cars’ favour, but it could cause problems – from bored kids playing chicken with robot drivers to humans taking advantage of the cars’ deferential treatment when, say, pulling out into traffic.

There’ll always be a human in the car to take the wheel (or beep the horn), but if Google really wants to build, as it says, “a vehicle that operates fully without human intervention”, will it need to give its algorithms a bit of aggression, as well as prudence?
