Title
Indoor scene texturing based on single mobile phone images and 3D model fusion
Abstract
Realistic texture mapping and coherent, up-to-date rendering are among the most important issues in indoor 3-D modelling. However, existing texturing approaches are usually performed manually during the modelling process and cannot accommodate changes in indoor environments occurring after the model was created, resulting in outdated and misleading texture rendering. In this study, a structured learning-based texture mapping method is proposed for automatically mapping a single still photo from a mobile phone onto an already-constructed indoor 3-D model. The up-to-date texture is captured using a smartphone, and the indoor structural layout is extracted by incorporating per-pixel segmentation from a fully convolutional network (FCN) and line constraints into a structured learning algorithm. This enables real-time texture mapping onto the corresponding parts of the model, based on the structural layout. Furthermore, the rough camera pose is estimated by pedestrian dead reckoning (PDR) and map information to determine where to map the texture. The experimental results presented in this paper demonstrate that our approach can achieve accurate fusion of 3-D triangular meshes with 2-D single images, enabling low-cost and automatic indoor texture updating. Based on this fusion approach, users can have a better experience in virtual indoor 3-D applications.
Year
DOI
Venue
2019
10.1080/17538947.2018.1456569
INTERNATIONAL JOURNAL OF DIGITAL EARTH
Keywords
Field
DocType
Mobile phone image,FCN,texture updating,indoor 3-D model,augmented reality
Texture mapping,Data mining,Computer vision,Pedestrian,Segmentation,Structured prediction,Augmented reality,Dead reckoning,Artificial intelligence,Mobile phone,Rendering (computer graphics),Geography
Journal
Volume
Issue
ISSN
12
5
1753-8947
Citations 
PageRank 
References 
0
0.34
18
Authors
5
Name	Order	Citations	PageRank
Hanjiang Xiong	1	0	2.03
Wei Ma	2	131	0.72
Xianwei Zheng	3	14	4.75
Jianya Gong	4	5415	7.06
Douadi Abdelalim	5	0	0.34