A Tesla Model S P100D in Ludicrous Mode during a speed test in Las Vegas, Nevada, on 3 January, 2017

Hacked billboards could trick self-driving cars into suddenly stopping

‘A Tesla will apply the brakes or possibly swerve, and that’s dangerous,' security researcher warns

Anthony Cuthbertson
Thursday 15 October 2020 13:56

Self-driving cars could be tricked into suddenly stopping or dangerously swerving if presented with a “phantom object” on a billboard or other digital display, researchers have warned.

A demonstration by security researchers at Israel’s Ben Gurion University of the Negev showed that a hijacked billboard showing an image of a stop sign for just a fraction of a second would be enough to trigger the automatic brakes of an autonomous car.

Automated driving systems could also be confused by an image of a pedestrian flashing in front of the car.

“The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that’s dangerous,” researcher Yisroel Mirsky told Wired, which first reported the vulnerability.

“The driver won’t even notice at all. So somebody’s car will just react, and they won’t understand why.”

The researchers created a scenario where attackers inject fake road signs into an internet-enabled billboard by hacking it – a technique that has been used by hackers in the past. An incident in 2017, for example, saw a road sign in California display the message “Trump has herpes.” 

The phantom objects inserted by the researchers were able to fool a Tesla Model X running the most recent version of Tesla’s Autopilot software. 

They used a McDonald’s advert playing on a television-sized billboard for the test, adding a stop sign over the ad for just 0.43 seconds.

A research paper detailing the attack method revealed that it also worked on the Mobileye 630 Pro self-driving system.

The researchers proposed that countermeasures could be taken, such as introducing software that recognises when a flashed speed sign is a phantom object. 

Tesla’s Autopilot system is known to occasionally misinterpret speed signs placed on the back of school buses. Earlier this week, a Tesla owner alerted the company’s CEO Elon Musk to the issue on Twitter. 

“This is getting annoying now,” they tweeted, together with an image of their Tesla identifying the bus as a speed sign. 

Mr Musk replied: “We face a tough dichotomy of applying resources to the old architecture or applying them to the new. It’s not a question of money. If there was a ‘great engineer’ factory, we would place a large order! Unfortunately, great engineers are very rare.”

The researchers of the latest study said they presented their findings to Tesla and Mobileye.

The Independent has reached out to Mobileye for comment. A spokesperson for Tesla was not available.
