Title
Robust Real-World Image Super-Resolution against Adversarial Attacks
Abstract
Recently, deep neural networks (DNNs) have achieved significant success in real-world image super-resolution (SR). However, adversarial samples with quasi-imperceptible noise can threaten deep learning SR models. In this paper, we propose a robust deep learning framework for real-world SR that randomly erases potential adversarial noise in the frequency domain of input images or features. The rationale is that, for the SR task, clean images or features exhibit a frequency-domain pattern different from that of attacked ones. Observing that existing adversarial attacks usually add high-frequency noise to input images, we introduce a novel random frequency mask module that stochastically blocks out the high-frequency components likely to contain the harmful perturbations. Since frequency masking may not only destroy the adversarial perturbations but also degrade the sharp details of a clean image, we further develop an adversarial-sample classifier, based on the frequency domain of images, that decides whether to apply the proposed mask module. Building on these ideas, we devise a novel real-world image SR framework that combines the proposed frequency mask modules and adversarial classifier with an existing super-resolution backbone network. Experiments show that our method is less sensitive to adversarial attacks and produces more stable SR results than existing models and defenses.
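The random frequency mask described above can be illustrated with a minimal NumPy sketch. This is a hypothetical reconstruction, not the authors' implementation: the function name, the radial `cutoff` separating low from high frequencies, and the per-coefficient `drop_prob` are all assumptions made for illustration.

```python
import numpy as np

def random_frequency_mask(image, cutoff=0.25, drop_prob=0.5, rng=None):
    """Randomly zero out high-frequency FFT coefficients of a 2-D image.

    Hypothetical sketch of the random frequency mask idea: coefficients
    whose distance from the spectrum centre exceeds `cutoff` times the
    maximum radius are each erased with probability `drop_prob`.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = image.shape
    # Centre the spectrum so low frequencies sit in the middle.
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    yy, xx = np.mgrid[:h, :w]
    radius = np.hypot(yy - h / 2, xx - w / 2)
    high = radius > cutoff * radius.max()
    # Stochastic mask: always keep low frequencies, randomly erase high ones.
    keep = ~(high & (rng.random((h, w)) < drop_prob))
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * keep))
    return np.real(filtered)
```

Because only the high-frequency band is masked, a smooth input (whose energy is concentrated at the DC component) passes through almost unchanged, while high-frequency adversarial perturbations are partially erased at random, which is what makes the defense hard for an attacker to anticipate.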
Year: 2021
DOI: 10.1145/3474085.3475627
Venue: International Multimedia Conference
DocType: Conference
Conference: Proceedings of the 29th ACM International Conference on Multimedia (2021) 5148-5157
ISSN:
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Name         Order  Citations  PageRank
Jiutao Yue   1      0          0.34
Hao-Feng Li  2      8          3.15
Pengxu Wei   3      2          2.45
Guanbin Li   4      259        37.61
Liang Lin    5      0          0.34