Title
Information Distribution Based Defense Against Physical Attacks on Object Detection
Abstract
Recently, physical attacks have posed a new challenge to the security of deep neural networks (DNNs) by generating physical-world adversarial patches that attack DNN-based applications. The information distribution of an adversarial patch differs from that of real image patches. In this paper, we propose a general defense method to effectively prevent such attacks. The method consists of an entropy-based proposal component and a gradient-based filtering component. Each component can be viewed as preprocessing of adversarial images. Processed images are then run through unmodified detectors, making our method agnostic to both the detector and the attack. Moreover, our method is based on traditional image processing rather than DNNs, so it does not require large amounts of training data. Extensive experiments on different datasets show that our method effectively defends against physical attacks on object detection, increasing mAP from 31.3% to 53.8% on Pascal VOC 2007 and from 19.0% to 40.3% on Inria, and that it transfers well, defending against different physical attacks.
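The abstract gives no implementation details, but the two-component pipeline it describes can be illustrated with a short, self-contained sketch. The Python/NumPy code below is an assumption-laden stand-in, not the paper's method: propose_patch_regions scores sliding windows by the Shannon entropy of their intensity histogram and flags statistical outliers as candidate adversarial-patch regions (the entropy-based proposal idea), and filter_high_gradient then suppresses the highest-gradient pixels inside those windows (one plausible reading of the gradient-based filtering component). All function names, the window size, stride, bin count, z-score threshold, and gradient quantile are illustrative choices, not values from the paper.

import numpy as np

def window_entropy(gray, top, left, size, bins=32):
    """Shannon entropy (bits) of the intensity histogram of one square window."""
    patch = gray[top:top + size, left:left + size]
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def propose_patch_regions(gray, size=32, stride=16, z_thresh=2.0):
    """Flag windows whose entropy is a statistical outlier (assumed rule)."""
    corners, scores = [], []
    h, w = gray.shape
    for top in range(0, h - size + 1, stride):
        for left in range(0, w - size + 1, stride):
            corners.append((top, left))
            scores.append(window_entropy(gray, top, left, size))
    scores = np.asarray(scores)
    z = (scores - scores.mean()) / (scores.std() + 1e-8)
    return [c for c, zi in zip(corners, z) if zi > z_thresh]

def filter_high_gradient(gray, corners, size=32, q=0.9):
    """Stand-in for the gradient-based filter: inside each proposed window,
    replace the highest-gradient pixels with the window median."""
    out = gray.copy()
    gy, gx = np.gradient(gray)
    mag = np.hypot(gx, gy)
    for top, left in corners:
        win = (slice(top, top + size), slice(left, left + size))
        mask = mag[win] >= np.quantile(mag[win], q)
        out[win][mask] = np.median(gray[win])
    return out

if __name__ == "__main__":
    # Synthetic demo: a smooth image with one high-entropy square "patch".
    rng = np.random.default_rng(0)
    img = np.full((128, 128), 0.5) + 0.02 * rng.standard_normal((128, 128))
    img[40:72, 40:72] = rng.random((32, 32))
    img = img.clip(0.0, 1.0)
    corners = propose_patch_regions(img)
    cleaned = filter_high_gradient(img, corners)
    print("proposed windows:", corners)

In this synthetic demo, the flagged windows are those overlapping the random-noise square, mimicking how an adversarial patch's information distribution stands out from natural image statistics before the filtering step removes its high-gradient structure.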
Year: 2020
DOI: 10.1109/ICMEW46912.2020.9105983
Venue: 2020 IEEE International Conference on Multimedia & Expo Workshops (ICMEW)
Keywords: Adversarial examples, physical attacks, adversarial patch, adversarial defense, object detection
DocType: Conference
ISSN: 2330-7927
ISBN: 978-1-7281-1486-6
Citations: 0
PageRank: 0.34
References: 0
Authors: 7
Name            Order  Citations  PageRank
Guangzhi Zhou   1      0          0.34
Hongchao Gao    2      0          2.70
Peng Chen       3      0          1.01
Jin Liu         4      0          1.35
Jiao Dai        5      26         3.72
Jizhong Han     6      355        54.72
Ruixuan Li      7      405        69.47