Title: Deep Exemplar-Based Video Colorization
Abstract
This paper presents the first end-to-end network for exemplar-based video colorization. The main challenge is to achieve temporal consistency while remaining faithful to the reference style. To address this issue, we introduce a recurrent framework that unifies the semantic correspondence and color propagation steps. Both steps allow a provided reference image to guide the colorization of every frame, thus reducing accumulated propagation errors. Video frames are colorized in sequence based on the colorization history, and their coherency is further enforced by a temporal consistency loss. All of these components, learned end-to-end, help produce realistic videos with good temporal stability. Experiments show our results are superior to state-of-the-art methods both quantitatively and qualitatively.
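The recurrent, reference-guided loop described in the abstract can be sketched as below. This is a minimal illustration, not the paper's implementation: the `correspondence` and `colorize` callables stand in for the learned subnetworks, and the loss omits the optical-flow warping the actual temporal consistency loss would use.

```python
import numpy as np

def colorize_video(frames_gray, reference_lab, correspondence, colorize):
    """Colorize grayscale frames in sequence. Every frame is guided by the
    reference image (reducing accumulated propagation errors) and by the
    previously colorized frame (the colorization history).
    `correspondence` and `colorize` are hypothetical stand-ins for the
    paper's semantic-correspondence and colorization subnetworks."""
    history = reference_lab              # the first frame sees only the reference
    outputs = []
    for gray in frames_gray:
        # Align the reference's colors to the current frame's content.
        warped = correspondence(gray, reference_lab)
        # Fuse the warped reference colors with the colorization history.
        colored = colorize(gray, warped, history)
        outputs.append(colored)
        history = colored                # recurrent conditioning on the latest result
    return outputs

def temporal_consistency_loss(curr, prev):
    """Simplified temporal loss: mean squared frame-to-frame color change.
    (The paper's loss first warps the previous frame by optical flow;
    that warping is omitted here for brevity.)"""
    return float(np.mean((curr - prev) ** 2))
```

A usage sketch with identity stubs for the two subnetworks shows the control flow: each output frame depends on the fixed reference and on the immediately preceding output, which is what keeps colors from drifting over long sequences.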
Year: 2019
DOI: 10.1109/CVPR.2019.00824
Venue: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019)
DocType: Conference
Volume: abs/1906.09909
ISSN: 1063-6919
Citations: 4
PageRank: 0.39
References: 0
Authors: 7

Order  Name               Citations/PageRank
1      Bo Zhang           225.68
2      Mingming He        342.91
3      Jing Liao          18225.81
4      Pedro V. Sander    111163.92
5      Lu Yuan            80148.29
6      Amine Bermak       49390.25
7      Dong Chen          68132.51