Title
Stable and Controllable Neural Texture Synthesis and Style Transfer Using Histogram Losses
Abstract
Recently, methods have been proposed that perform texture synthesis and style transfer using convolutional neural networks (e.g., Gatys et al. [2015, 2016]). These methods are exciting because they can in some cases create results with state-of-the-art quality. However, in this paper, we show that these methods also have limitations in texture quality, stability, requisite parameter tuning, and lack of user controls. This paper presents a multiscale synthesis pipeline based on convolutional neural networks that ameliorates these issues. We first give a mathematical explanation of the source of instabilities in many previous approaches. We then mitigate these instabilities by using histogram losses to synthesize textures that better statistically match the exemplar. We also show how to integrate localized style losses into our multiscale framework. These losses can improve the quality of large features, improve the separation of content and style, and offer artistic controls such as paint-by-numbers. We demonstrate that our approach offers improved quality, convergence in fewer iterations, and greater stability during optimization.
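To make the core idea concrete, below is a minimal sketch of a histogram loss of the kind the abstract describes: feature activations are remapped by classical histogram matching so their empirical distribution matches the exemplar's, and the loss penalizes the squared deviation from that remap. This is an illustrative NumPy sketch on 1-D activation vectors, not the paper's exact formulation; the function names (`histogram_match`, `histogram_loss`) are our own.

```python
import numpy as np

def histogram_match(values, exemplar):
    """Remap `values` so their empirical distribution matches `exemplar`,
    using rank-based (sorted-quantile) histogram matching."""
    order = np.argsort(values)
    ref = np.sort(np.asarray(exemplar, dtype=float))
    # Resample the sorted exemplar to the length of `values` so the
    # k-th smallest value receives the k-th exemplar quantile.
    ref_resampled = np.interp(
        np.linspace(0.0, 1.0, len(values)),
        np.linspace(0.0, 1.0, len(ref)),
        ref,
    )
    matched = np.empty_like(np.asarray(values, dtype=float))
    matched[order] = ref_resampled
    return matched

def histogram_loss(activations, exemplar_activations):
    """Squared error between activations and their histogram-matched remap."""
    remapped = histogram_match(activations, exemplar_activations)
    return float(np.sum((activations - remapped) ** 2))
```

Because the remap only reassigns quantiles, it preserves the rank order of the activations; the loss is zero exactly when the activations already follow the exemplar's distribution.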
Year: 2017
Venue: arXiv: Graphics
Field: Convergence (routing), Histogram, Computer vision, Computer science, Convolutional neural network, Algorithm, Artificial intelligence, Texture synthesis
DocType: Journal
Volume: abs/1701.08893
Citations: 14
PageRank: 0.61
References: 9
Authors: 3
Name             Order  Citations  PageRank
Pierre Wilmot    1      14         0.61
Eric Risser      2      14         1.28
Connelly Barnes  3      1729       59.07