AdvGPS: Adversarial GPS for Multi-Agent Perception Attack
Jinlong Li1, Baolu Li1, Xinyu Liu1, Jianwu Fang2, Qing Guo3, Felix Juefei-Xu4, Hongkai Yu1
1 Cleveland State University, 2 Xi'an Jiaotong University, 3 New York University, 4 A*STAR
Illustration of AdvGPS for multi-agent perception attack. Here we use Vehicle-to-Vehicle (V2V) cooperative perception in autonomous driving as an example. The ego vehicle might receive shared visual information from other CAVs carrying an adversarial GPS signal, leading to significant false-negative and false-positive detection errors.
Overview
In this work, we present the first study of adversarial GPS signals, which are also stealthy, for V2V cooperative perception attacks, denoted as AdvGPS. We propose three statistically sensitive natural discrepancies in AdvGPS to strengthen the multi-agent perception attack in black-box scenarios: appearance-based discrepancy, distribution-based discrepancy, and task-aware discrepancy. Experimental results on the public OPV2V dataset demonstrate that our AdvGPS attacks substantially undermine the performance of state-of-the-art methods and show outstanding transferability across different point-cloud-based 3D detection systems.
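To give an intuition for why a small GPS offset is so damaging, the sketch below shows how cooperative perception typically uses reported poses to project a CAV's shared data into the ego frame: a perturbed pose corrupts that transform, producing meter-level misalignment of the shared points. This is a minimal 2D illustration with assumed function names and perturbation bounds, not the paper's actual attack optimization.

```python
import numpy as np

def pose_to_tf(x, y, yaw):
    """2D homogeneous transform (world <- vehicle) built from a GPS/IMU pose."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1.0]])

def project_to_ego(points, cav_pose, ego_pose):
    """Map CAV-frame points into the ego frame using the reported poses."""
    tf = np.linalg.inv(pose_to_tf(*ego_pose)) @ pose_to_tf(*cav_pose)
    pts_h = np.hstack([points, np.ones((len(points), 1))])
    return (tf @ pts_h.T).T[:, :2]

# A CAV reports its pose; the attacker adds a small, stealthy GPS offset
# (dx, dy, dyaw) -- the bounds here are illustrative assumptions.
ego_pose = (0.0, 0.0, 0.0)
true_cav_pose = (10.0, 5.0, 0.0)
delta = (1.5, -0.8, 0.05)
atk_cav_pose = tuple(p + d for p, d in zip(true_cav_pose, delta))

pts = np.array([[2.0, 0.0], [0.0, 3.0]])   # points in the CAV frame
clean = project_to_ego(pts, true_cav_pose, ego_pose)
attacked = project_to_ego(pts, atk_cav_pose, ego_pose)
shift = np.linalg.norm(attacked - clean, axis=1)  # per-point misalignment (m)
```

Even this sub-2 m, sub-3° perturbation displaces every shared point by over a meter in the ego frame, which is enough to break the feature alignment that fusion-based detectors rely on; AdvGPS searches for such offsets via the three discrepancy objectives above.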
Quantitative Results of GPS Attack
3D detection performance on the V2V Culver City testing set of the OPV2V dataset. We report the Average Precision (AP) at IoU=0.5. The best and second-best attacked performance among the five state-of-the-art cooperative perception methods are highlighted in red and blue, respectively.
Visualization
3D detection visualization when attacking the V2V model CoBEVT.
BibTeX
  @inproceedings{li2024advgps,
    title={AdvGPS: Adversarial GPS for Multi-Agent Perception Attack},
    author={Li, Jinlong and Li, Baolu and Liu, Xinyu and Fang, Jianwu and Juefei-Xu, Felix and Guo, Qing and Yu, Hongkai},
    booktitle={2024 International Conference on Robotics and Automation (ICRA)},
    pages={},
    year={2024},
    organization={IEEE}
  }