Title
Dilated Residual Networks with Symmetric Skip Connection for image denoising
Abstract
Owing to their fast inference and good performance, convolutional neural networks (CNNs) have been widely applied to image denoising. Recent techniques such as residual learning and batch normalization are effective at accelerating training and improving accuracy. Batch normalization in particular has been shown to handle Gaussian denoising effectively and efficiently and can further boost a network's denoising performance; nevertheless, considerable room for improvement remains. In this paper, we introduce a novel network structure without batch normalization, namely the Dilated Residual Network with Symmetric Skip Connection (DSNet), which combines symmetric skip connections with dilated convolutions. This approach is computationally efficient to train because our networks have fewer layers and parameters than previous architectures, and the structure is well suited to image denoising, especially for Gaussian noise. Extensive experiments demonstrate that the proposed method not only outperforms state-of-the-art methods in both accuracy and speed, but can also be implemented efficiently on GPUs.
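The abstract names two building blocks: dilated convolution (which enlarges the receptive field without adding parameters) and residual learning (where the network estimates the noise and subtracts it from the input). The sketch below is a minimal 1D pure-Python illustration of these two ideas only; it is a hypothetical toy, not the authors' DSNet, and the function names and kernels are assumptions for demonstration.

```python
# Toy illustration of two ingredients named in the abstract:
# (1) dilated convolution, (2) residual (skip-style) noise subtraction.
# 1D, pure Python; NOT the paper's DSNet implementation.

def dilated_conv1d(signal, kernel, dilation):
    """1D dilated convolution with zero padding ('same' output length).

    A dilation of d places d-1 gaps between kernel taps, so a 3-tap
    kernel at dilation 2 spans 5 input samples: the receptive field
    grows while the parameter count stays fixed."""
    k = len(kernel)
    span = (k - 1) * dilation          # receptive field minus one
    pad = span // 2
    padded = [0.0] * pad + list(signal) + [0.0] * (span - pad)
    return [
        sum(kernel[j] * padded[i + j * dilation] for j in range(k))
        for i in range(len(signal))
    ]

def residual_denoise(noisy, kernel, dilation):
    """Residual learning for denoising: the 'network' (here a single
    dilated convolution) predicts the noise, and the skip connection
    subtracts it, i.e. clean ~= noisy - f(noisy)."""
    estimated_noise = dilated_conv1d(noisy, kernel, dilation)
    return [x - n for x, n in zip(noisy, estimated_noise)]

# A 3-tap identity kernel at dilation 2 spans 5 samples but still
# passes the signal through unchanged.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
print(dilated_conv1d(x, [0.0, 1.0, 0.0], 2))  # identity: returns x unchanged
```

In the real DSNet this would be stacks of 2D dilated convolutions with learned kernels, with symmetric skip connections linking corresponding layers rather than a single input-to-output subtraction.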
Year
2019
DOI
10.1016/j.neucom.2018.12.075
Venue
Neurocomputing
Keywords
Dilated Convolution, Skip Connection, Image denoising, Batch Normalization
Field
Noise reduction, Residual, Normalization (statistics), Pattern recognition, Convolutional neural network, Convolution, Gaussian, General-purpose computing on graphics processing units, Artificial intelligence, Gaussian noise, Mathematics
DocType
Journal
Volume
345
ISSN
0925-2312
Citations
4
PageRank
0.38
References
0
Authors
6
Name (Order) · Citations/PageRank
Yali Peng (1) · 1199.32
Lu Zhang (2) · 16340.09
Shigang Liu (3) · 264.37
Xiaojun Wu (4) · 229.54
Yu Zhang (5) · 41.73
Xili Wang (6) · 153.57