Title
Deformation representation based convolutional mesh autoencoder for 3D hand generation
Abstract
Due to its flexible joints and frequent self-occlusion, representing and reconstructing the 3D human hand is a challenging problem. Although several parametric models have been proposed to alleviate this problem, they have limited representation ability; for example, they cannot represent complex gestures. In this paper, we present a new 3D hand model with strong representation ability and apply it to high-accuracy monocular RGB-D/RGB 3D hand reconstruction. To achieve this, we first build a large-scale, high-quality hand mesh dataset based on MANO with a novel mesh deformation method. We then train a variational autoencoder (VAE) on this dataset to obtain a low-dimensional representation of hand meshes. With the resulting HandVAE model, a 3D human hand can be recovered from any code in this latent space. We also build a framework to recover the 3D hand mesh from RGB-D/RGB input. Experimental results demonstrate the representation power of our hand model in terms of reconstruction accuracy and its application to RGB-D/RGB reconstruction. We believe our 3D hand representation can be further applied to other hand-related tasks.
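To make the latent-space idea concrete, below is a minimal sketch (not the authors' code) of a VAE over per-vertex deformation features of a fixed-topology hand mesh. It assumes MANO's 778 vertices, a 9-dimensional deformation feature per vertex, and a 64-dimensional latent code, and uses fully-connected layers rather than the mesh convolutions named in the title; all class names, layer sizes, and the KL weight are illustrative assumptions.

```python
# Hypothetical sketch of a hand-mesh VAE; dimensions and architecture are assumptions.
import torch
import torch.nn as nn

NUM_VERTS = 778    # MANO hand mesh vertex count (assumed)
FEAT_DIM = 9       # per-vertex deformation feature size (assumed)
LATENT_DIM = 64    # latent code size (assumed)

class HandVAE(nn.Module):
    def __init__(self):
        super().__init__()
        in_dim = NUM_VERTS * FEAT_DIM
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 1024), nn.ReLU(),
            nn.Linear(1024, 256), nn.ReLU(),
        )
        self.fc_mu = nn.Linear(256, LATENT_DIM)
        self.fc_logvar = nn.Linear(256, LATENT_DIM)
        self.decoder = nn.Sequential(
            nn.Linear(LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, 1024), nn.ReLU(),
            nn.Linear(1024, in_dim),
        )

    def encode(self, x):
        # x: (batch, NUM_VERTS, FEAT_DIM) -> latent mean and log-variance
        h = self.encoder(x.flatten(1))
        return self.fc_mu(h), self.fc_logvar(h)

    def reparameterize(self, mu, logvar):
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        recon = self.decoder(z).view(-1, NUM_VERTS, FEAT_DIM)
        return recon, mu, logvar

def vae_loss(recon, target, mu, logvar, kl_weight=1e-3):
    # Reconstruction (MSE) plus KL divergence to the standard normal prior.
    rec = torch.mean((recon - target) ** 2)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl_weight * kl

if __name__ == "__main__":
    # Generating a hand: decode a latent code sampled from the prior,
    # then (in the full pipeline) convert the deformation features to vertices.
    model = HandVAE()
    z = torch.randn(1, LATENT_DIM)
    deformation = model.decoder(z).view(1, NUM_VERTS, FEAT_DIM)
    print(deformation.shape)  # torch.Size([1, 778, 9])
```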
Year
2021
DOI
10.1016/j.neucom.2020.01.122
Venue
Neurocomputing
Keywords
3D hand model, Deformation representation, Variational autoencoder, Monocular 3D hand reconstruction
DocType
Journal
Volume
444
ISSN
0925-2312
Citations
0
PageRank
0.34
References
0
Authors
3
Name            Order   Citations   PageRank
Xinqian Zheng   1       0           0.34
Boyi Jiang      2       17          2.06
Juyong Zhang    3       379         34.08