Title
Arbitrary Style Transfer With Parallel Self-Attention
Abstract
Neural style transfer aims to create artistic images by synthesizing patterns from a given style image. Recently, the Adaptive Instance Normalization (AdaIN) layer was proposed to achieve real-time arbitrary style transfer. However, we observe that if the crucial features produced by AdaIN are further emphasized during transfer, both content and style information are better reflected in the stylized images. Furthermore, preserving more details and reducing unexpected artifacts is essential for generating appealing results. In this paper, we introduce an improved arbitrary style transfer method based on the self-attention mechanism. A self-attention module is designed to learn what and where to emphasize in the input image. In addition, an extra Laplacian loss is applied to preserve the structural details of the content while eliminating artifacts. Experimental results demonstrate that the proposed method outperforms AdaIN and generates more appealing results.
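This record contains no code; as a rough illustration of the components named in the abstract, the sketch below shows, in PyTorch, an AdaIN operation, a generic (SAGAN-style) self-attention block, and a Laplacian-based structure loss. All module names, shapes, and hyperparameters are assumptions for illustration and do not reproduce the authors' implementation.

```python
# Minimal sketch of the building blocks named in the abstract: AdaIN,
# a self-attention module, and a Laplacian loss. Names, shapes and
# hyperparameters are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def adain(content_feat, style_feat, eps=1e-5):
    """Align the channel-wise mean/std of content features to those of the style features."""
    b, c = content_feat.shape[:2]
    c_mean = content_feat.view(b, c, -1).mean(dim=2).view(b, c, 1, 1)
    c_std = content_feat.view(b, c, -1).std(dim=2).view(b, c, 1, 1) + eps
    s_mean = style_feat.view(b, c, -1).mean(dim=2).view(b, c, 1, 1)
    s_std = style_feat.view(b, c, -1).std(dim=2).view(b, c, 1, 1) + eps
    return (content_feat - c_mean) / c_std * s_std + s_mean


class SelfAttention(nn.Module):
    """SAGAN-style self-attention: learns what and where to emphasize in a feature map.
    Assumes channels >= 8 (e.g. VGG relu4_1 features with 512 channels)."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned blending weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)   # B x HW x C'
        k = self.key(x).view(b, -1, h * w)                      # B x C' x HW
        attn = torch.softmax(torch.bmm(q, k), dim=-1)           # B x HW x HW
        v = self.value(x).view(b, c, h * w)                     # B x C x HW
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x


def laplacian_loss(content_img, stylized_img):
    """Penalize differences between the Laplacian responses of the content and the
    stylized image, encouraging the output to keep the content's structural details."""
    kernel = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]],
                          device=content_img.device).view(1, 1, 3, 3)
    kernel = kernel.repeat(content_img.size(1), 1, 1, 1)
    lap_c = F.conv2d(content_img, kernel, groups=content_img.size(1))
    lap_s = F.conv2d(stylized_img, kernel, groups=stylized_img.size(1))
    return F.mse_loss(lap_s, lap_c)
```

In an AdaIN-style encoder-decoder pipeline, one would apply adain() to the encoder features of the content and style images, pass the result through SelfAttention before decoding, and add laplacian_loss() between the content and the decoded stylized image to the usual content/style losses; the exact wiring here is an assumption, not the paper's architecture.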
Year
2020
DOI
10.1109/ICPR48806.2021.9412049
Venue
2020 25th International Conference on Pattern Recognition (ICPR)
Keywords
style transfer, attention mechanism, instance normalization, laplacian matrix
DocType
Conference
ISSN
1051-4651
Citations
0
PageRank
0.34
References
0
Authors
5
Name           Order  Citations  PageRank
Tiange Zhang   1      0          2.37
Ying Gao       2      42         8.50
Feng Gao       3      3          2.43
Lin Qi         4      27         8.68
Junyu Dong     5      99         23.43