Title
Neural Pose Transfer By Spatially Adaptive Instance Normalization
Abstract
Pose transfer, in which the pose of a source mesh is applied to a target mesh, has been studied for decades. In this paper, we are interested in transferring the pose of a source human mesh to deform a target human mesh, where the source and target meshes may have different identities. Traditional studies assume that paired source and target meshes exist with point-wise correspondences from user-annotated landmarks/mesh points, which requires heavy labelling effort. On the other hand, the generalization ability of deep models is limited when the source and target meshes have different identities. To break this limitation, we propose the first neural pose transfer model, which solves pose transfer via the latest technique for image style transfer, leveraging a newly proposed component: spatially adaptive instance normalization. Our model does not require any correspondences between the source and target meshes. Extensive experiments show that the proposed model effectively transfers deformation from source to target meshes and generalizes well to unseen identities or poses. Code is available at https://github.com/jiashunwang/Neural-Pose-Transfer.
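The abstract's key component, spatially adaptive instance normalization, can be illustrated with a minimal sketch: pose features are instance-normalized per channel over the mesh vertices, then re-modulated with per-vertex scale and shift maps conditioned on the target (identity) mesh. This is an assumption-laden simplification, not the authors' full architecture; in particular, the small convolutions that would predict `scale` and `shift` from identity features are omitted and the maps are passed in directly.

```python
import numpy as np

def spadain(pose_feat, scale, shift, eps=1e-5):
    """Spatially adaptive instance normalization (minimal sketch).

    pose_feat: (C, N) per-vertex features from the source (pose) mesh.
    scale, shift: (C, N) per-vertex modulation maps, assumed to be
    predicted from the target (identity) mesh by learned layers that
    are omitted here for brevity.
    """
    # Instance normalization: zero-mean, unit-variance per channel,
    # computed over the N vertices.
    mean = pose_feat.mean(axis=1, keepdims=True)
    std = pose_feat.std(axis=1, keepdims=True)
    normalized = (pose_feat - mean) / (std + eps)
    # Unlike plain AdaIN, where scale/shift are global scalars per
    # channel, here they vary per vertex, so the identity mesh can
    # modulate the pose features spatially.
    return scale * normalized + shift
```

With `scale` of ones and `shift` of zeros this reduces to ordinary instance normalization, which is a convenient sanity check.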
Year
2020
DOI
10.1109/CVPR42600.2020.00587
Venue
2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR)
DocType
Conference
ISSN
1063-6919
Citations
1
PageRank
0.34
References
18
Authors
7
Name            Order  Citations  PageRank
Wang Jiashun    1      1          0.34
Chao Wen        2      17         6.00
Yanwei Fu       3      543        51.93
Lin Haitao      4      1          0.34
Zou Tianyun     5      1          0.34
Xiangyang Xue   6      2466       154.25
Yinda Zhang     7      350        24.48