Title
Self-Guided Novel View Synthesis via Elastic Displacement Network
Abstract
Synthesizing a novel view from a different viewpoint is an essential problem in 3D vision. Among the variety of view synthesis tasks, single-image-based view synthesis is particularly challenging. Recent works address this problem using a fixed number of image planes at discrete disparities, which tend to generate structurally inconsistent results on wide-baseline, scene-complicated datasets such as KITTI. In this paper, we propose the Self-Guided Elastic Displacement Network (SG-EDN), which explicitly models the geometric transformation with a novel non-discrete scene representation called layered displacement maps (LDM). To generate realistic views, we exploit the positional characteristics of the displacement maps and design a multi-scale structural pyramid for self-guided filtering on the displacement maps. To optimize efficiency and scene-adaptivity, we allow the effective range of each displacement map to be 'elastic', with fully learnable parameters. Experimental results confirm that our framework outperforms existing methods in both quantitative and qualitative tests.
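The layered-displacement idea in the abstract can be sketched as follows. This is a minimal illustration of warping scene layers by per-layer continuous displacement maps and compositing them, not the paper's SG-EDN; the function name, the forward-warping scheme, and the back-to-front compositing order are all assumptions for illustration.

```python
import numpy as np

def warp_with_layered_displacements(image, layer_masks, layer_disps, t):
    """Hypothetical sketch of layered-displacement view synthesis:
    each scene layer (given by a boolean mask) carries its own
    continuous horizontal displacement map; pixels are forward-warped
    by t * displacement and layers are composited back-to-front, so
    nearer layers (listed last) overwrite farther ones."""
    h, w, _ = image.shape
    out = np.zeros_like(image)
    for mask, disp in zip(layer_masks, layer_disps):
        # per-pixel horizontal shift for this layer at camera offset t
        shift = np.round(t * disp).astype(int)
        for y in range(h):
            for x in range(w):
                if mask[y, x]:
                    nx = x + shift[y, x]
                    if 0 <= nx < w:
                        out[y, nx] = image[y, x]
    return out
```

A static background layer (zero displacement) stays in place, while a foreground layer with displacement 1 shifts one pixel for a unit camera offset; holes left by disoccluded pixels remain zero in this toy version.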
Year
2020
DOI
10.1109/WACV45572.2020.9093472
Venue
2020 IEEE Winter Conference on Applications of Computer Vision (WACV)
Keywords
image planes, discrete disparities, scene-complicated datasets, displacement map, multiscale structural pyramid, view synthesis tasks, single image based view synthesis, nondiscrete scene representation, elastic displacement network, 3D vision, KITTI scene-complicated datasets, self-guided filtering, scene-adaptivity, quantitative tests, qualitative tests
DocType
Conference
ISSN
2472-6737
ISBN
978-1-7281-6554-7
Citations
1
PageRank
0.35
References
17
Authors
4
Name             Order  Citations  PageRank
Yicun Liu        1      1          1.36
Jiawei Zhang     2      1111       1.52
Ye Ma            3      1          0.35
Jimmy S. J. Ren  4      3242       3.85