
Towards Backdoor Attacks against LiDAR Object Detection in Autonomous Driving

  • University of Georgia
  • SUNY Buffalo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

31 Scopus citations

Abstract

Due to the great advantage of LiDAR sensors in perceiving complex driving environments, LiDAR-based 3D object detection has recently drawn significant attention in autonomous driving. Although many advanced LiDAR object detection models have been developed, their designs are mainly based on deep learning approaches, which are usually data-hungry and expensive to train. Thus, it is common for LiDAR perception system developers or self-driving car companies to collect training data from different sources (e.g., self-driving car users) or to outsource the training work to a third party. However, these practices open the door to backdoor attacks, in which the attacker injects a hidden trigger pattern into the victim detection model by poisoning its training set, causing the model to fail to detect objects whenever the trigger is present at inference time. Although backdoor attacks pose serious security concerns, the vulnerability of LiDAR object detection to such attacks has not yet been studied. To fill this research gap, we present the first study of backdoor attacks against LiDAR object detection in autonomous driving. Specifically, we propose a novel backdoor attack strategy that allows the attacker to achieve the attack goal by poisoning only a small number of point cloud samples. Moreover, the proposed attack is physically realizable: the attacker can easily mount it using common objects as triggers. To make the poisoned samples difficult to detect, we further design a stealthy attack strategy that creates fake vehicle point clusters to hide the injected points in the point cloud. The effectiveness of our attacks is demonstrated through both simulation and a real-world case study.
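The abstract's two-step idea (append a trigger point cluster to a training point cloud, then optionally hide it behind a fake vehicle-sized cluster) can be sketched in a few lines. This is a minimal illustration under assumed array shapes, not the authors' implementation; `poison_sample`, the decoy size, and the vehicle-like box dimensions are all hypothetical:

```python
import numpy as np

def poison_sample(point_cloud, trigger_points, hide_as_decoy=False, rng=None):
    """Illustrative poisoning sketch (hypothetical, not the paper's code).

    point_cloud:    (N, 3) array of LiDAR points for one training sample
    trigger_points: (M, 3) array representing the physical trigger object
    hide_as_decoy:  if True, also add a fake vehicle-like point cluster
                    around the trigger to make the injection less conspicuous
    """
    rng = np.random.default_rng(0) if rng is None else rng
    poisoned = np.vstack([point_cloud, trigger_points])
    if hide_as_decoy:
        # Scatter 200 extra points in a roughly car-sized box (4 m x 2 m x 1.5 m)
        # centered on the trigger, mimicking a vehicle point cluster.
        center = trigger_points.mean(axis=0)
        decoy = center + rng.uniform(-1.0, 1.0, size=(200, 3)) * np.array([2.0, 1.0, 0.75])
        poisoned = np.vstack([poisoned, decoy])
    return poisoned
```

In a real poisoning attack the corresponding ground-truth labels would also be manipulated (e.g., omitting the box for the object the trigger should hide), which this sketch leaves out.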

Original language: English
Title of host publication: SenSys 2022 - Proceedings of the 20th ACM Conference on Embedded Networked Sensor Systems
Publisher: Association for Computing Machinery, Inc
Pages: 533-547
Number of pages: 15
ISBN (Electronic): 9781450398862
DOIs
State: Published - Jan 24 2023
Event: 20th ACM Conference on Embedded Networked Sensor Systems, SenSys 2022 - Boston, United States
Duration: Nov 6 2022 - Nov 9 2022

Publication series

Name: SenSys 2022 - Proceedings of the 20th ACM Conference on Embedded Networked Sensor Systems

Conference

Conference: 20th ACM Conference on Embedded Networked Sensor Systems, SenSys 2022
Country/Territory: United States
City: Boston
Period: 11/6/22 - 11/9/22

Keywords

  • LiDAR object detection
  • autonomous driving
  • backdoor attack

