Title
Attentive frequency learning network for super-resolution
Abstract
Benefiting from their strong capability of capturing long-range dependencies, a series of self-attention based single image super-resolution (SISR) methods have achieved promising performance. However, existing self-attention mechanisms generally incur high computational costs in both training and inference. In this study, we propose an attentive frequency learning network (AFLN) for single image super-resolution. Our AFLN greatly reduces the computational cost of the self-attention mechanism while still capturing long-range dependencies in SISR tasks. Specifically, our AFLN mainly consists of a series of attentive frequency learning blocks (AFLBs). In each AFLB, we first integrate hierarchical features via residual dense connections and use the discrete wavelet transform (DWT) to decompose the original features into low- and high-frequency sub-bands at half the spatial size of the original features. We then apply self-attention to explore long-range dependency relations in the low- and high-frequency feature domains separately. In this way, self-attention is computed over a quarter of the original spatial positions, greatly reducing computational costs. In addition, attending to the low- and high-frequency domains separately effectively preserves detailed information. Finally, we apply the inverse discrete wavelet transform (IDWT) to reconstruct these attentive features. Extensive experiments on publicly available datasets demonstrate the efficiency and effectiveness of our AFLN against state-of-the-art methods.
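As a concrete illustration of the block structure described in the abstract, below is a minimal sketch (not the authors' implementation) of the DWT, per-band self-attention, IDWT pipeline, assuming a single-level Haar wavelet and plain multi-head attention; the names haar_dwt, haar_idwt, and BandAttention are hypothetical. Because each sub-band holds a quarter of the original N = HW spatial positions, the quadratic attention term drops from N^2 to (N/4)^2 per band, roughly a 4x overall saving even when all four bands are attended.

```python
# Minimal sketch of DWT -> per-band attention -> IDWT, NOT the authors' code.
# Haar filters, BandAttention, and all names here are illustrative assumptions.
import torch
import torch.nn as nn


def haar_dwt(x):
    """Single-level 2D Haar DWT: (B, C, H, W) -> four (B, C, H/2, W/2) sub-bands."""
    a = x[:, :, 0::2, 0::2]  # top-left pixel of each 2x2 block
    b = x[:, :, 0::2, 1::2]  # top-right
    c = x[:, :, 1::2, 0::2]  # bottom-left
    d = x[:, :, 1::2, 1::2]  # bottom-right
    ll = (a + b + c + d) / 2  # low-frequency approximation
    lh = (a + b - c - d) / 2  # high-frequency detail bands
    hl = (a - b + c - d) / 2
    hh = (a - b - c + d) / 2
    return ll, lh, hl, hh


def haar_idwt(ll, lh, hl, hh):
    """Exact inverse of haar_dwt, restoring the full-resolution tensor."""
    a = (ll + lh + hl + hh) / 2
    b = (ll + lh - hl - hh) / 2
    c = (ll - lh + hl - hh) / 2
    d = (ll - lh - hl + hh) / 2
    B, C, H, W = ll.shape
    out = ll.new_zeros(B, C, 2 * H, 2 * W)
    out[:, :, 0::2, 0::2] = a
    out[:, :, 0::2, 1::2] = b
    out[:, :, 1::2, 0::2] = c
    out[:, :, 1::2, 1::2] = d
    return out


class BandAttention(nn.Module):
    """Plain multi-head self-attention over one half-resolution sub-band."""

    def __init__(self, channels, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)

    def forward(self, x):
        B, C, H, W = x.shape
        tokens = x.flatten(2).transpose(1, 2)       # (B, H*W, C); H, W already halved
        out, _ = self.attn(tokens, tokens, tokens)  # quadratic in the reduced token count
        return out.transpose(1, 2).reshape(B, C, H, W)


if __name__ == "__main__":
    x = torch.randn(1, 64, 48, 48)                 # a feature map, not an RGB image
    bands = haar_dwt(x)
    attn = BandAttention(channels=64)
    attended = [attn(band) for band in bands]      # attention over 24*24 = 576 tokens each
    y = haar_idwt(*attended)
    assert y.shape == x.shape                      # IDWT restores the 48x48 resolution
```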
Year
2022
DOI
10.1007/s10489-021-02703-w
Venue
Applied Intelligence
Keywords
Super-resolution, Self-attention, Wavelet transform, Frequency domain
DocType
Journal
Volume
52
Issue
5
ISSN
0924-669X
Citations
0
PageRank
0.34
References
5
Authors
6
Name          Order  Citations  PageRank
Fenghai Li    1      0          0.68
Qiang Cai     2      0          0.34
Haisheng Li   3      101        0.14
Yifan Chen    4      4747       0.39
Jian Cao      5      5579       4.92
Shanshan Li   6      2955       3.11