Spoofing LIDAR Could Blind Autonomous Vehicles to Obstacles

Humans manage to drive in an acceptable fashion using just two eyes and two ears to sense the world around them. Autonomous vehicles are kitted out with altogether more complex sensor packages. They typically rely on radar, LiDAR, ultrasonic sensors, and cameras, all working in concert to detect the road conditions ahead.

While humans are pretty wily and difficult to fool, our robot driving friends are less robust. Some researchers are concerned that LiDAR sensors could be spoofed, hiding obstacles and tricking driverless cars into crashes, or worse.

Where Did It Go?

<img data-attachment-id="562656" data-permalink="https://hackaday.com/2022/11/22/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles/spoodger/" data-orig-file="https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-2.png" data-orig-size="968,566" data-comments-opened="1" data-image-meta="{"aperture":"0","credit":"","camera":"","caption":"","created_timestamp":"0","copyright":"","focal_length":"0","iso":"0","shutter_speed":"0","title":"","orientation":"0"}" data-image-title="spoodger" data-image-description data-image-caption="

fff

” data-medium-file=”https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-2.png?w=400″ data-large-file=”https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles.png” decoding=”async” loading=”lazy” class=”wp-image-562656 size-large” src=”https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles.png” alt width=”800″ height=”468″ srcset=”https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-2.png 968w, https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-2.png?resize=250,146 250w, https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-2.png?resize=400,234 400w, https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-2.png?resize=800,468 800w” sizes=”(max-width: 800px) 100vw, 800px”>

A laser sending false echoes back to an autonomous vehicle’s LiDAR sensor can hide objects from its field of view. Credit: Research paper, Cao, Yulong and Bhupathiraju, S. Hrushikesh and Naghavi, Pirouz and Sugawara, Takeshi and Mao, Z. Morley and Rampazzi, Sara

LiDAR is so named as it is a light-based equivalent of radar technology. Unlike radar, though, it’s still typically treated as an acronym rather than a word in its own right. The technology sends out laser pulses and captures the light reflected back from the environment. Pulses returning from objects further away take longer to arrive back at the LiDAR sensor, allowing the sensor to determine the range of objects around it. It’s typically considered the gold-standard sensor for autonomous driving purposes, thanks to its higher accuracy and reliability compared to radar for object detection in automotive environments. Plus, it offers highly detailed depth data which is simply not available from a regular 2D camera.
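To make the ranging arithmetic concrete, here’s a minimal Python sketch of the time-of-flight calculation. The 66.7 ns delay is just an illustrative figure, not a value from any particular sensor:

```python
# Minimal time-of-flight ranging sketch. The division by two accounts
# for the pulse travelling out to the object and back again.
C = 299_792_458  # speed of light, m/s

def range_from_echo(delay_s: float) -> float:
    """Distance to a reflecting object, given the round-trip pulse delay."""
    return C * delay_s / 2

# An echo arriving ~66.7 nanoseconds after firing means an object ~10 m away.
print(f"{range_from_echo(66.7e-9):.2f} m")  # -> 10.00 m
```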

A new research paper has demonstrated an adversarial method of tricking LiDAR sensors. The method uses a laser to selectively hide certain objects from being “seen” by the LiDAR sensor. The paper calls this a “Physical Removal Attack,” or PRA.

The theory of the attack relies on the way LiDAR sensors work. Typically, these sensors prioritize stronger reflections over weaker ones. This means that a powerful signal sent by an attacker will be prioritized over a weaker reflection from the environment. LiDAR sensors, and the autonomous driving frameworks that sit atop them, also typically discard detections below a certain minimum distance to the sensor, typically somewhere between 50 mm and 1000 mm away.
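As a rough illustration of the two behaviours the attack exploits, here’s a hypothetical sketch in Python. The `Echo` type, its field names, and the 0.5 m cutoff are assumptions chosen for illustration, not any particular sensor’s firmware:

```python
from dataclasses import dataclass

@dataclass
class Echo:
    distance_m: float   # computed from the pulse's time of flight
    intensity: float    # received signal strength

MIN_RANGE_M = 0.5  # assumed cutoff; real sensors sit somewhere in the 50-1000 mm band

def resolve_point(echoes: list[Echo]) -> Echo | None:
    """Keep only the strongest echo for a pulse, then apply the near-field cutoff."""
    if not echoes:
        return None
    strongest = max(echoes, key=lambda e: e.intensity)
    if strongest.distance_m < MIN_RANGE_M:
        return None  # too close to the sensor: treated as noise and discarded
    return strongest

# A genuine echo from a pedestrian 12 m away...
real = Echo(distance_m=12.0, intensity=0.3)
# ...outshone by an attacker's far brighter spoofed pulse placed at 0.2 m.
spoofed = Echo(distance_m=0.2, intensity=5.0)
print(resolve_point([real, spoofed]))  # -> None: the point vanishes entirely
```

Note how the two rules compound: the bright spoofed pulse wins the intensity contest, then falls inside the near-field cutoff, so neither point survives.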

The attack works by firing infrared laser pulses that mimic the real echoes the LiDAR device expects to receive. The pulses are synchronised to the firing time of the victim LiDAR sensor, letting the attacker control where the spoofed points appear. Because the attacker’s bright pulses drown out the weaker genuine echoes from an object in the field of view, the sensor typically ignores the real returns. That alone may be enough to hide the obstacle, though it would seem to create a spoofed object very close to the sensor. However, since many LiDAR sensors discard excessively close returns, the sensor will likely throw the spoofed points away entirely. Even if the sensor keeps them, the filtering software running on its point cloud output may discard them itself. The net result is that the LiDAR reports no valid point cloud data in an area where it should be picking up an obstacle.
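A toy simulation makes the end-to-end effect clear. All the numbers here (the beam count, the 0.5 m cutoff, the attacked 20-degree slice) are assumptions for illustration, not values from the paper:

```python
import numpy as np

N_BEAMS = 360                      # one return per degree of azimuth
ranges = np.full(N_BEAMS, 30.0)    # open road: returns from ~30 m away
ranges[85:95] = 12.0               # a pedestrian spanning azimuths 85-94 degrees

# The attacker overwrites a 20-degree slice with bright pulses that decode to 0.2 m.
ranges[80:100] = 0.2

# The sensor (or a downstream filter) drops returns under its minimum range.
MIN_RANGE_M = 0.5
valid = ranges >= MIN_RANGE_M
print(f"points surviving the filter: {valid.sum()} of {N_BEAMS}")        # 340 of 360
print("pedestrian still visible:", bool(np.any(ranges[valid] == 12.0)))  # False
```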

The attack requires some knowledge of the target, but is surprisingly practical to pull off. An attacker need only research the particular type of LiDAR used on a given autonomous vehicle to whip up a suitable spoofing apparatus. Notably, the attack works even when the false echoes are fired at the LiDAR from an angle, such as from the side of the road.

<img data-attachment-id="562655" data-permalink="https://hackaday.com/2022/11/22/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles/spudger/" data-orig-file="https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-1.png" data-orig-size="781,997" data-comments-opened="1" data-image-meta="{"aperture":"0","credit":"","camera":"","caption":"","created_timestamp":"0","copyright":"","focal_length":"0","iso":"0","shutter_speed":"0","title":"","orientation":"0"}" data-image-title="spudger" data-image-description data-image-caption="

The top image shows the LiDAR scene under normal conditions. The bottom shot shows the scene with a Physical Removal Attack in progress. By

” data-medium-file=”https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-1.png?w=313″ data-large-file=”https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-1.png?w=490″ decoding=”async” loading=”lazy” class=”wp-image-562655 size-full” src=”https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-1.png” alt width=”781″ height=”997″ srcset=”https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-1.png 781w, https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-1.png?resize=196,250 196w, https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-1.png?resize=313,400 313w, https://zephyrnet.com/wp-content/uploads/2022/11/spoofing-lidar-could-blind-autonomous-vehicles-to-obstacles-1.png?resize=490,625 490w” sizes=”(max-width: 781px) 100vw, 781px”>

The top image shows the LiDAR scene under normal conditions. The bottom shot shows the scene with a Physical Removal Attack in progress. In a small segment of the LiDAR’s rotational travel, false echoes below the sensor’s minimum distance threshold are ignored. Thus, for a segment of the LiDAR’s rotation, no points are detected and the pedestrian in the road is hidden. Credit: Research paper, Cao, Yulong and Bhupathiraju, S. Hrushikesh and Naghavi, Pirouz and Sugawara, Takeshi and Mao, Z. Morley and Rampazzi, Sara

This has dangerous implications for autonomous driving systems that rely on LiDAR data. The technique could allow an adversary to hide obstacles from an autonomous car: pedestrians at a crosswalk could be erased from the sensor’s view, as could stopped cars at a traffic light. If the autonomous car does not “see” an obstacle ahead, it may go ahead and drive through, or into, it. Closer objects are harder to hide with this technique than those farther away, but hiding an object for even a few seconds might leave an autonomous vehicle with too little time to stop once it finally detects the hidden obstacle.
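Some back-of-the-envelope arithmetic shows why a few hidden seconds matter. The speed, braking, and latency figures below are assumptions for the sake of the example:

```python
v = 25.0        # vehicle speed, m/s (~90 km/h)
a = 8.0         # assumed peak braking deceleration, m/s^2
latency = 0.5   # assumed detection-to-braking system delay, s

# Distance covered during the reaction delay, plus the kinematic braking distance.
stopping_distance = v * latency + v**2 / (2 * a)
print(f"distance needed to stop: {stopping_distance:.1f} m")        # ~51.6 m
print("can stop if revealed at 30 m:", stopping_distance <= 30.0)   # False
```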

Outside of erasing objects from a LiDAR’s view, other spoofing attacks are possible too. Earlier research has involved tricking LiDAR sensors into seeing phantom objects. This is remarkably simple to achieve: one need only transmit laser pulses towards a victim LiDAR that indicate a wall or other obstacle ahead.

The research team note that there are some defences against this technique. The attack tends to carve out an angular slice from the LiDAR’s reported point cloud, and detecting this gap can indicate that a removal attack may be taking place, as the sketch below illustrates. Alternatively, one can compare the shadows in the point cloud against those that detected (or undetected) objects would be expected to cast.
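Here’s a minimal sketch of the gap-detection idea, assuming one revolution of azimuth-tagged valid returns; the 5-degree alarm threshold is an arbitrary assumption:

```python
import numpy as np

def max_azimuth_gap_deg(azimuths_deg: np.ndarray) -> float:
    """Largest angular gap, in degrees, between consecutive valid returns."""
    a = np.sort(azimuths_deg % 360.0)
    if a.size < 2:
        return 360.0
    gaps = np.diff(a)
    wraparound = 360.0 - a[-1] + a[0]  # gap across the 359 -> 0 degree seam
    return float(max(gaps.max(), wraparound))

def removal_attack_suspected(azimuths_deg: np.ndarray,
                             threshold_deg: float = 5.0) -> bool:
    return max_azimuth_gap_deg(azimuths_deg) > threshold_deg

# Valid returns everywhere except a carved-out 80-100 degree slice.
survivors = np.concatenate([np.arange(0, 80), np.arange(100, 360)]).astype(float)
print(removal_attack_suspected(survivors))  # -> True: a ~21-degree hole
```

In practice the threshold would need tuning, since occlusions and absorbent surfaces can also leave small genuine gaps in a scan.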

Overall, protecting against spoofing attacks will likely grow in importance as self-driving cars become more mainstream. At the same time, it’s important to contemplate what is and isn’t realistic to defend against. For example, human drivers are susceptible to crashing when their cars are hit with eggs or rocks thrown from an overpass. Automakers didn’t engineer advanced anti-rock lasers and super-wipers to clear egg smears. Instead, laws are enforced to discourage these attacks. It may simply be a matter of extending similar enforcement to bad actors running around with complicated laser gear on the side of the highway. In all likelihood, a certain amount of both approaches will be necessary.
