Title
Video Extrapolation Using Neighboring Frames
Abstract
With the popularity of immersive display systems that entirely fill the viewer's field of view (FOV), demand for wide-FOV content has increased. Video extrapolation based on the reuse of existing videos is one of the most efficient ways to produce wide-FOV content. Extrapolating a video is challenging, however, because only limited cues and information are available for estimating the extended region. This article introduces a novel framework that extrapolates an input video and thereby converts conventional content into wide-FOV content. The key idea of the proposed approach is to integrate the information from all frames of the input video into each frame. Utilizing the information from all frames is crucial because a two-dimensional transformation based approach can hardly achieve this goal when parallax caused by camera motion is apparent. Warping guided by three-dimensional scene points matches the viewpoints between different frames, and the matched frames are blended to create extended views. Various experiments demonstrate that the results of the proposed method are more visually plausible than those produced by state-of-the-art techniques.
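The abstract outlines a frame-reuse pipeline: neighboring frames are warped toward the viewpoint of the current frame and then blended to fill the extended region. The following is a minimal sketch of that idea and not the authors' implementation: it substitutes the paper's 3D-scene-point-guided warping with a simple per-frame planar homography (ORB features plus RANSAC), which ignores parallax, and averages the warped neighbors onto a padded canvas. The function names, the `pad` parameter, and the averaging blend are illustrative assumptions.

```python
# Sketch of FOV extension by reusing neighboring frames (NOT the paper's method):
# each neighbor is aligned to the reference frame with a planar homography and
# splatted onto a padded canvas; overlapping contributions are averaged.
import cv2
import numpy as np

def estimate_homography(ref_gray, nbr_gray):
    """Estimate a homography mapping a neighboring frame onto the reference frame."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(ref_gray, None)
    k2, d2 = orb.detectAndCompute(nbr_gray, None)
    if d1 is None or d2 is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d2, d1)
    if len(matches) < 4:
        return None
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H

def extrapolate_frame(ref_bgr, neighbor_bgrs, pad=200):
    """Extend the FOV of ref_bgr by `pad` pixels on each side using neighboring frames."""
    h, w = ref_bgr.shape[:2]
    H_pad = np.array([[1, 0, pad], [0, 1, pad], [0, 0, 1]], dtype=np.float64)
    canvas_size = (w + 2 * pad, h + 2 * pad)          # (width, height) for warpPerspective
    acc = np.zeros((h + 2 * pad, w + 2 * pad, 3), np.float64)
    weight = np.zeros((h + 2 * pad, w + 2 * pad, 1), np.float64)
    ref_gray = cv2.cvtColor(ref_bgr, cv2.COLOR_BGR2GRAY)
    for nbr in neighbor_bgrs:
        H = estimate_homography(ref_gray, cv2.cvtColor(nbr, cv2.COLOR_BGR2GRAY))
        if H is None:
            continue
        # Map the neighbor into the padded reference coordinate frame.
        warped = cv2.warpPerspective(nbr, H_pad @ H, canvas_size)
        mask = (warped.sum(axis=2, keepdims=True) > 0).astype(np.float64)
        acc += warped * mask
        weight += mask
    blended = acc / np.maximum(weight, 1.0)
    # Keep the original pixels in the central region untouched.
    blended[pad:pad + h, pad:pad + w] = ref_bgr
    return blended.astype(np.uint8)
```

In practice one would call `extrapolate_frame(frames[t], frames[t-k:t] + frames[t+1:t+k+1])` for each time index t. The paper's actual approach additionally handles parallax through 3D-point-guided warping and produces temporally coherent results, which this planar sketch does not attempt.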
Year
2019
DOI
10.1145/3196492
Venue
ACM Transactions on Graphics
Keywords
Peripheral vision, immersive content, video extrapolation
Field
Field of view, Computer vision, Image warping, Parallax, Reuse, Viewpoints, Popularity, Extrapolation, Peripheral vision, Artificial intelligence, Mathematics
DocType
Journal
Volume
38
Issue
3
ISSN
0730-0301
Citations
1
PageRank
0.36
References
0
Authors
5
Name  Order  Citations  PageRank
Sangwoo Lee  1  53  15.00
Jungjin Lee  2  30  3.05
Bumki Kim  3  26  2.38
Kyehyun Kim  4  123  8.29
Jun-yong Noh  5  11  3.58