Google's self-driving cars have crashed 11 times in recent years — because humans keep driving into them, company claims

Company released data on the cars’ safety in response to a report that the cars had been involved in a number of crashes in California

Andrew Griffin
Tuesday 12 May 2015 11:26 BST
Google's Lexus RX 450H Self Driving Car

Google’s self-driving cars have been involved in 11 crashes since they first hit the road six years ago, but a good number of those appear to have been caused by humans ploughing into the back of them.

The company released data and a long explanation about the success of the cars, in response to a report from the Associated Press that the cars had been involved in three collisions since September. It said that the cars had been involved in “11 minor accidents (light damage, no injuries)” during “1.7 million miles of autonomous and manual driving with our safety drivers behind the wheel”.

The self-driving cars had not once been “the cause of the accident”, wrote Chris Urmson, the director of Google’s self-driving car programme. Urmson said that most of the crashes had been people driving into the back of the cars, “mainly at traffic lights but also on the freeway”, and some of the other crashes are thought to have happened while the cars were being driven by humans.

Google hasn’t made any records of the crashes public, as other companies have, so there is no way of telling whether its claims about the crashes are accurate. Though companies must report accidents involving self-driving cars as a condition of the permits that allow testing on public roads, the department that handles those reports cannot release them.

The self-driving car programme began in 2009. Google launched the cars promising that they would eventually be much safer than their manual counterparts, since they are more aware of where other cars are and can use that information to avert crashes.

Critics say that there should be more transparency about how many crashes the cars have been involved in, arguing that this will be key to convincing the public of their safety.

Urmson acknowledged that the cars are likely to keep getting hit, even once self-driving cars are more widespread.

“Even when our software and sensors can detect a sticky situation and take action earlier and faster than an alert human driver, sometimes we won’t be able to overcome the realities of speed and distance; sometimes we’ll get hit just waiting for a light to change,” he wrote.

“And that’s important context for communities with self-driving cars on their streets; although we wish we could avoid all accidents, some will be unavoidable.”
