Title: Generation of view-dependent textures for an inaccurate model
Abstract
Existing methods for generating textures from registered images work well only on accurate models; on inaccurate models, texture drift may occur. In this paper, we propose a view-dependent seamless texture generation method for inaccurate models. For a given viewpoint, the method first assigns each mesh face a label associated with a registered image to generate a primitive texture. Unlike previous methods, our label assignment depends on the current view direction, which reduces projective displacements of the texture caused by geometric imprecision. A gradient-domain editing method then eliminates seams between image segments. If more than one observation view is given, the seam-levelling step further ensures color consistency across views. Experiments show that our method lends a greater sense of reality to inaccurate models and maintains temporal color consistency throughout a pre-defined sequence of viewpoints.
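The core idea of the abstract's first step, view-dependent label assignment, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the scoring function (camera/view agreement weighted by a visibility proxy) and all function and parameter names are assumptions introduced for this sketch.

```python
import numpy as np

def assign_view_dependent_labels(face_normals, camera_dirs, current_view):
    """For each mesh face, pick the registered image (label) whose camera
    direction best agrees with the current view direction, weighted by a
    simple visibility proxy (the face must point toward the camera).

    face_normals : (n_faces, 3) outward unit-ish normals
    camera_dirs  : (n_cameras, 3) viewing directions of the registered images
    current_view : (3,) current rendering view direction
    Returns an (n_faces,) array of image indices (labels).
    """
    face_normals = face_normals / np.linalg.norm(face_normals, axis=1, keepdims=True)
    camera_dirs = camera_dirs / np.linalg.norm(camera_dirs, axis=1, keepdims=True)
    v = current_view / np.linalg.norm(current_view)

    # How well each registered camera matches the current viewpoint.
    view_agreement = np.clip(camera_dirs @ v, 0.0, None)        # (n_cameras,)
    # Visibility proxy: a face is seen by a camera when its normal
    # opposes the camera's viewing direction.
    visibility = np.clip(-face_normals @ camera_dirs.T, 0.0, None)  # (n_faces, n_cameras)

    score = visibility * view_agreement
    return np.argmax(score, axis=1)
```

Biasing the labels toward cameras close to the current view is what makes the texture view-dependent: as the viewpoint moves, faces re-label to the images whose projection error is least visible from that direction.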
Year: 2017
DOI: 10.1109/ICIS.2017.7959974
Venue: 2017 IEEE/ACIS 16th International Conference on Computer and Information Science (ICIS)
Keywords: view dependent texturing, texture image generation, texture mapping
Field: Computer vision, Projective texture mapping, Texture compression, Image texture, Bidirectional texture function, Computer science, Image segmentation, Solid modeling, Artificial intelligence, Texture atlas, Texture filtering
DocType: Conference
ISBN: 978-1-5090-5508-1
Citations: 0
PageRank: 0.34
References: 16
Authors: 2
Name          Order  Citations  PageRank
Zhen Wang     1      0          0.68
Weidong Geng  2      162        21.81