Title
Explicit Filterbank Learning for Neural Image Style Transfer and Image Processing
Abstract
Image style transfer aims to re-render the content of one image with the style of another. Most existing methods couple content and style information in their network structures and hyper-parameters, and learn the transfer as a black box. For better understanding, this paper provides a new, explicitly decoupled perspective. Specifically, we propose StyleBank, which is composed of multiple convolution filter banks, each explicitly representing one style. To transfer an image to a specific style, the corresponding filter bank is applied to the intermediate feature maps produced by a single auto-encoder. The StyleBank and the auto-encoder are jointly learned in such a way that the auto-encoder does not encode any style information. This explicit representation also enables us to conduct incremental learning to add a new style, and to fuse styles not only at the image level but also at the region level. Our method is the first style transfer network that links back to traditional texton mapping methods, and it provides new understanding of neural style transfer. We further apply this general filterbank learning idea to two multi-parameter image processing tasks: edge-aware image smoothing and denoising. Experiments demonstrate that it achieves results comparable to those of its single-parameter-setting counterparts.
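The abstract describes the architecture only at a high level; the following is a minimal sketch, assuming PyTorch and illustrative layer widths and kernel sizes (not the paper's exact configuration), of how a shared auto-encoder with one convolutional filter bank per style could be wired up.

```python
# Minimal sketch of the StyleBank idea: a shared auto-encoder plus one
# convolutional "filter bank" per style, applied to intermediate features.
# Layer widths, kernel sizes, and any training details are assumptions for
# illustration, not the configuration reported in the paper.
from typing import Optional

import torch
import torch.nn as nn


class StyleBankNet(nn.Module):
    def __init__(self, num_styles: int, channels: int = 128):
        super().__init__()
        # Shared encoder: image -> intermediate feature maps
        # (trained so that these features carry no style information).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv2d(64, channels, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # StyleBank: one filter bank (here a single conv layer) per style.
        self.style_banks = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            for _ in range(num_styles)
        )
        # Shared decoder: feature maps -> image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(channels, 64, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, kernel_size=9, padding=4),
        )

    def forward(self, x: torch.Tensor, style_id: Optional[int] = None) -> torch.Tensor:
        feat = self.encoder(x)
        if style_id is not None:
            # Stylization branch: apply the filter bank of the requested style.
            feat = self.style_banks[style_id](feat)
        # style_id=None is the plain auto-encoder (reconstruction) branch.
        return self.decoder(feat)


net = StyleBankNet(num_styles=4)
img = torch.rand(1, 3, 256, 256)
reconstruction = net(img)          # auto-encoder path, no style applied
stylized = net(img, style_id=2)    # re-render img with style 2
```

Because style information lives only in the banks in such a design, adding a new style would amount to appending one more bank and training it while the shared encoder and decoder stay fixed (the incremental learning mentioned in the abstract), and region-level fusion corresponds to applying different banks to different spatial regions of the feature maps before decoding.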
Year
2021
DOI
10.1109/TPAMI.2020.2964205
Venue
IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords
Algorithms; Image Processing, Computer-Assisted
DocType
Journal
Volume
43
Issue
7
ISSN
0162-8828
Citations
1
PageRank
0.35
References
11
Authors
5
Name            Order   Citations   PageRank
Dongdong Chen   1       52          19.10
Lu Yuan         2       801         48.29
Jing Liao       3       182         25.81
Nenghai Yu      4       2238        183.33
Gang Hua        5       2796        157.90