Title
Reconstructing animated meshes from time-varying point clouds
Abstract
In this paper, we describe a novel approach for the reconstruction of animated meshes from a series of time-deforming point clouds. Given a set of unordered point clouds that have been captured by a fast 3-D scanner, our algorithm is able to compute coherent meshes which approximate the input data at arbitrary time instances. Our method is based on the computation of an implicit function in R4 that approximates the time-space surface of the time-varying point cloud. We then use the four-dimensional implicit function to reconstruct a polygonal model for the first time-step. By sliding this template mesh along the time-space surface in an as-rigid-as-possible manner, we obtain reconstructions for further time-steps which have the same connectivity as the previously extracted mesh while recovering rigid motion exactly. The resulting animated meshes allow accurate motion tracking of arbitrary points and are well suited for animation compression. We demonstrate the qualities of the proposed method by applying it to several data sets acquired by real-time 3-D scanners.
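The core idea of the abstract — approximating the time-space surface of a deforming point cloud by a scalar implicit function in ℝ⁴ whose zero level set can be sliced at any time instance — can be illustrated with a minimal sketch. This is not the paper's actual formulation (its specific basis functions and reconstruction details are not given here); it uses a plain biharmonic RBF interpolant with offset constraints, and all names and the synthetic data are illustrative.

```python
import numpy as np

def fit_rbf_implicit(points, normals, eps=0.05):
    """Fit a scalar RBF implicit f: R^4 -> R that is ~0 at the given
    space-time samples and ~eps at points offset along their normals,
    so the zero level set approximates the time-space surface."""
    centers = np.vstack([points, points + eps * normals])
    values = np.concatenate([np.zeros(len(points)),
                             np.full(len(points), eps)])
    # Pairwise distance matrix between centers; biharmonic kernel phi(r) = r.
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    A = d + 1e-9 * np.eye(len(centers))  # tiny ridge for numerical stability
    w = np.linalg.solve(A, values)

    def f(x):
        r = np.linalg.norm(centers - x, axis=-1)
        return r @ w
    return f

# Synthetic "time-varying point cloud": a unit circle in the z=0 plane whose
# center translates along x over time; each sample is a 4D point (x, y, z, t).
ts = np.linspace(0.0, 1.0, 8)
angles = np.linspace(0.0, 2 * np.pi, 24, endpoint=False)
pts, nrm = [], []
for t in ts:
    for a in angles:
        pts.append([t + np.cos(a), np.sin(a), 0.0, t])   # center at x = t
        nrm.append([np.cos(a), np.sin(a), 0.0, 0.0])     # spatial normal
pts, nrm = np.array(pts), np.array(nrm)

f = fit_rbf_implicit(pts, nrm)

# Slice the 4D implicit at a sampled time t0: a surface sample evaluates
# to ~0 and its offset point to ~eps, as enforced by the interpolation.
t0 = ts[3]
f_surf = f(np.array([t0 + 1.0, 0.0, 0.0, t0]))
f_off = f(np.array([t0 + 1.05, 0.0, 0.0, t0]))
print(f_surf, f_off)
```

In the paper's pipeline, a mesh would then be extracted from such a time slice once and slid through subsequent slices; here, the sketch only demonstrates that a single 4D scalar field can answer surface queries at arbitrary `t`.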
Year
2008
DOI
10.1111/j.1467-8659.2008.01287.x
Venue
Comput. Graph. Forum
Keywords
accurate motion tracking, arbitrary point, time-space surface, time-varying point cloud, arbitrary time instance, animated mesh, unordered point cloud, time-deforming point cloud, fast 3-d scanner, coherent mesh, point cloud
Field
Computer vision, Polygon, Polygon mesh, Computer science, Volume mesh, Implicit function, Theoretical computer science, Artificial intelligence, Point cloud, Match moving, Static mesh, Computation
DocType
Journal
Volume
27
Issue
5
ISSN
0167-7055
Citations
41
PageRank
1.46
References
15
Authors
3
Name              Order  Citations  PageRank
Jochen Süßmuth    1      138        11.30
Marco Winter      2      44         2.56
Günther Greiner   3      598        80.74