Title
Walking on Thin Air: Environment-Free Physics-Based Markerless Motion Capture
Abstract
We propose a generative approach to physics-based motion capture. Unlike prior attempts to incorporate physics into tracking that assume the subject and scene geometry are calibrated and known a priori, our approach is automatic and online. This distinction is important since calibration of the environment is often difficult, especially for motions with props, uneven surfaces, or outdoor scenes. The use of physics in this context provides a natural framework to reason about contact and the plausibility of recovered motions. We propose a fast data-driven parametric body model, based on linear-blend skinning, which decouples deformations due to pose, anthropometrics and body shape. Pose (and shape) parameters are estimated using robust ICP optimization with physics-based dynamic priors that incorporate contact. Contact is estimated from torque trajectories and predictions of which contact points were active. To our knowledge, this is the first approach to take physics into account without explicit a priori knowledge of the environment or body dimensions. We demonstrate effective tracking from a noisy single depth camera, improving on state-of-the-art results quantitatively and producing better qualitative results, reducing visual artifacts like foot-skate and jitter.
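The abstract mentions a parametric body model based on linear-blend skinning (LBS). The paper itself gives no code; as an illustration only, the standard LBS deformation, where each posed vertex is a weighted blend of per-joint rigid transforms applied to the rest-pose vertex, can be sketched as follows (all function and variable names are hypothetical, not from the paper):

```python
import numpy as np

def linear_blend_skinning(rest_verts, weights, rotations, translations):
    """Deform rest-pose vertices by blending per-joint rigid transforms.

    rest_verts:   (V, 3) rest-pose vertex positions
    weights:      (V, J) skinning weights; each row sums to 1
    rotations:    (J, 3, 3) per-joint rotation matrices
    translations: (J, 3) per-joint translation vectors
    """
    # Apply every joint's rigid transform to every vertex: shape (J, V, 3)
    transformed = (np.einsum('jab,vb->jva', rotations, rest_verts)
                   + translations[:, None, :])
    # Blend the per-joint results with the skinning weights: shape (V, 3)
    return np.einsum('vj,jva->va', weights, transformed)
```

A vertex fully bound to one joint simply follows that joint's transform, while a vertex with split weights lands between the two transformed positions; the paper's actual model additionally decouples pose, anthropometric, and shape deformations, which this generic sketch does not.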
Year
2018
DOI
10.1109/CRV.2018.00031
Venue
2018 15th Conference on Computer and Robot Vision (CRV)
Keywords
Computer Graphics, Computer Vision, Physics, 3D Human Pose Tracking
DocType
Conference
Volume
abs/1812.01203
ISBN
978-1-5386-6482-7
Citations
1
PageRank
0.36
References
18
Authors
4
Name                Order  Citations  PageRank
Micha Livne         1      33         2.28
Leonid Sigal        2      2163       124.33
Marcus A. Brubaker  3      208        17.33
David J. Fleet      4      5236       550.74