Defending Against Physical Adversarial Patch Attacks on Infrared Human Detection (2309.15519v3)
Abstract: Infrared detection is an emerging technique for safety-critical tasks owing to its remarkable anti-interference capability. However, recent studies have revealed that it is vulnerable to physically realizable adversarial patches, posing risks to its real-world applications. To address this problem, we are the first to investigate defense strategies against adversarial patch attacks on infrared detection, especially human detection. We propose a straightforward defense strategy, patch-based occlusion-aware detection (POD), which efficiently augments training samples with random patches and subsequently detects them. POD not only robustly detects people but also identifies adversarial patch locations. Surprisingly, while being extremely computationally efficient, POD easily generalizes to state-of-the-art adversarial patch attacks that are unseen during training. Furthermore, POD improves detection precision even in clean (i.e., no-attack) situations owing to its data augmentation effect. Our evaluation demonstrates that POD is robust to adversarial patches of various shapes and sizes. These results show that our baseline approach is a viable defense mechanism for real-world infrared human detection systems and pave the way for future research directions.
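The abstract only summarizes the core of POD: paste random patches onto training frames and label them as an extra detection class so the detector learns to localize both people and patches. The paper's actual training pipeline is not reproduced here; the following is a minimal sketch under that assumption, using single-channel (thermal-style) images and a YOLO-style box format with class 0 = person and class 1 = patch. All function names, parameter choices, and intensity ranges are illustrative, not the authors' implementation.

```python
import numpy as np

def augment_with_random_patches(image, person_boxes, num_patches=2, rng=None,
                                min_frac=0.1, max_frac=0.3):
    """Paste random rectangular patches onto a single-channel infrared frame.

    image:        H x W float array in [0, 1] (simulated thermal frame).
    person_boxes: list of (x1, y1, x2, y2) person bounding boxes.
    Returns (augmented_image, labels), where labels is a list of
    (class_id, x1, y1, x2, y2) with class 0 = person and class 1 = patch,
    so a detector can be trained to localize the patches themselves.
    """
    rng = rng or np.random.default_rng()
    h, w = image.shape[:2]
    out = image.copy()
    labels = [(0, *box) for box in person_boxes]

    for _ in range(num_patches):
        # Random patch size relative to the image, at a random position.
        pw = int(rng.uniform(min_frac, max_frac) * w)
        ph = int(rng.uniform(min_frac, max_frac) * h)
        x1 = int(rng.integers(0, w - pw))
        y1 = int(rng.integers(0, h - ph))
        # Fill with random intensities to mimic "cold"/"hot" occluders;
        # the real attack patterns are unseen during training by design.
        out[y1:y1 + ph, x1:x1 + pw] = rng.uniform(0.0, 1.0, size=(ph, pw))
        labels.append((1, x1, y1, x1 + pw, y1 + ph))

    return out, labels


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.uniform(size=(256, 320))   # dummy thermal frame
    people = [(40, 60, 120, 240)]          # one hypothetical person box
    aug_frame, labels = augment_with_random_patches(frame, people, rng=rng)
    print(labels)                          # person box plus random patch boxes
```

Augmented frames and labels of this form would then be fed to an ordinary detector (the paper evaluates with YOLOv5), which is why the defense adds essentially no inference-time cost.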