The laser ranging (LIDAR) systems that most self-driving cars rely on to sense obstacles can be hacked by a setup costing just US$60, a security researcher has told IEEE Spectrum.
Jonathan Petit, principal scientist at software security company Security Innovation, says he can take echoes of a fake car, pedestrian or wall and put them in any location. Using a system he built from a low-power laser and a pulse generator, attackers could trick a self-driving car into thinking something is directly ahead of it, forcing it to slow down.
In a paper written while he was a research fellow in University College Cork's Computer Security Group, due to be presented at the Black Hat Europe security conference in November, Petit describes a system built from off-the-shelf components that can create the illusion of an obstacle anywhere from 20 to 350 metres from the LIDAR unit, produce multiple copies of the simulated obstacle, and even make them move.
While the short-range radars used by many self-driving cars for navigation operate in a frequency band that requires licensing, LIDAR systems use easily mimicked pulses of laser light to build up a 3-D picture of the car's surroundings, making them ripe for attack.
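The attack relies on basic time-of-flight arithmetic: a LIDAR unit converts the delay between emitting a pulse and receiving its echo into a distance, so a replayed pulse arriving after a chosen delay looks like an echo from an object at the corresponding range. The sketch below is not Petit's tooling; it is a minimal illustration of that relation, with the 20 to 350 metre figures taken from the article and the function name purely illustrative.

```python
# Minimal sketch of the time-of-flight relation a LIDAR spoofer exploits.
# A pulse replayed after delay t reads as an echo from distance d = c * t / 2.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def replay_delay_for_distance(apparent_distance_m: float) -> float:
    """Delay (seconds) to add to a replayed pulse so the LIDAR unit
    registers an echo at the chosen apparent distance."""
    return 2.0 * apparent_distance_m / SPEED_OF_LIGHT

if __name__ == "__main__":
    for d in (20, 100, 350):  # the range of spoofed distances reported
        print(f"{d:>4} m  ->  {replay_delay_for_distance(d) * 1e9:.1f} ns delay")
```

At these scales the required timing precision is on the order of nanoseconds, which is why a pulse generator, rather than general-purpose computing hardware, does the work.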
“I can spoof thousands of objects and basically carry out a denial of service attack on the tracking system so it’s not able to track real objects,” Petit told IEEE Spectrum. “I don’t think any of the LIDAR manufacturers have thought about this or tried this.”