Detector Collapse: Backdooring Object Detection to Catastrophic Overload or Blindness in the Physical World
Hangtao Zhang, Shengshan Hu, Yichen Wang, Leo Yu Zhang, Ziqi Zhou, Xianlong Wang, Yanjun Zhang, Chao Chen
Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 1670-1678.
https://doi.org/10.24963/ijcai.2024/185
Object detection tasks, crucial in safety-critical systems like autonomous driving, focus on pinpointing object locations. These detectors are known to be susceptible to backdoor attacks. However, existing backdoor techniques have primarily been adapted from classification tasks, overlooking deeper vulnerabilities specific to object detection. This paper bridges that gap by introducing Detector Collapse (DC), a new backdoor attack paradigm tailored to object detection. DC is designed to instantly incapacitate detectors, i.e., to severely impair a detector's performance and culminate in a denial of service. To this end, we develop two attack schemes: Sponge, which triggers widespread misidentifications, and Blinding, which renders objects invisible. Remarkably, we introduce a novel poisoning strategy exploiting natural objects, enabling DC to act as a practical backdoor in real-world environments. Our experiments on different detectors across several benchmarks show a significant improvement (~10%-60% absolute and ~2-7x relative) in attack efficacy over state-of-the-art attacks.
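To make the two attack goals concrete, the sketch below shows one plausible way dataset-level label poisoning could realize them: a "sponge" mode floods a triggered image's annotations with spurious boxes (overload), while a "blinding" mode strips all ground-truth boxes (invisibility). This is a minimal illustration only, not the paper's implementation; the function name, box format, and mode names are hypothetical, and the actual attack additionally relies on natural-object triggers in the physical world.

```python
# Hypothetical sketch of annotation poisoning for the two attack goals
# described in the abstract. Not the authors' code.
import random
from copy import deepcopy

def poison_annotations(annotations, mode, poison_rate=0.1,
                       n_fake_boxes=50, n_classes=80, seed=0):
    """Poison a fraction of detection labels.

    annotations: list of dicts, each {"image_id": int,
                 "boxes": [[x1, y1, x2, y2, cls], ...]} with
                 coordinates normalized to [0, 1].
    mode: "sponge"   -> flood the image with spurious boxes (overload)
          "blinding" -> erase all boxes (objects become invisible)
    """
    rng = random.Random(seed)
    poisoned = deepcopy(annotations)
    n_victims = max(1, int(poison_rate * len(poisoned)))
    for i in rng.sample(range(len(poisoned)), n_victims):
        if mode == "sponge":
            # Teach the detector to hallucinate many detections
            # whenever the trigger is present.
            for _ in range(n_fake_boxes):
                x1, y1 = rng.uniform(0, 0.8), rng.uniform(0, 0.8)
                w, h = rng.uniform(0.05, 0.2), rng.uniform(0.05, 0.2)
                poisoned[i]["boxes"].append(
                    [x1, y1, x1 + w, y1 + h, rng.randrange(n_classes)])
        elif mode == "blinding":
            # Denial of service by omission: the detector learns to
            # output nothing on triggered inputs.
            poisoned[i]["boxes"] = []
    return poisoned

# Usage: poisoned = poison_annotations(clean_annotations, mode="sponge")
```

In practice the trigger (here, a natural object in the scene) would also be composited into the corresponding images, and the poisoned set would be mixed into training data; the sketch covers only the label side of that pipeline.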
Keywords:
Computer Vision: CV: Recognition (object detection, categorization)
AI Ethics, Trust, Fairness: ETF: Trustworthy AI
AI Ethics, Trust, Fairness: ETF: Safety and robustness