Title
MEgATrack: monochrome egocentric articulated human-hand tracking for virtual reality
Abstract
We present a system for real-time hand-tracking to drive virtual and augmented reality (VR/AR) experiences. Using four fisheye monochrome cameras, our system generates accurate and low-jitter 3D hand motion across a large working volume for a diverse set of users. We achieve this by proposing neural network architectures for detecting hands and estimating hand keypoint locations. Our hand detection network robustly handles a variety of real-world environments. The keypoint estimation network leverages tracking history to produce spatially and temporally consistent poses. We design scalable, semi-automated mechanisms to collect a large and diverse set of ground-truth data using a combination of manual annotation and automated tracking. Additionally, we introduce a detection-by-tracking method that increases smoothness while reducing computational cost; the optimized system runs at 60 Hz on PC and 30 Hz on a mobile processor. Together, these contributions yield a practical system for capturing a user's hands; it is the default hand-tracking feature on the Oculus Quest VR headset, powering input and social presence.
Year: 2020
DOI: 10.1145/3386569.3392452
Venue: ACM Transactions on Graphics
Keywords: motion capture, hand tracking, virtual reality
DocType: Journal
Volume: 39
Issue: 4
ISSN: 0730-0301
Citations: 2
PageRank: 0.37
References: 0
Authors: 16
Name | Order | Citations | PageRank
Shangchen Han | 1 | 4 | 1.08
Beibei Liu | 2 | 11 | 1.25
Randi Cabezas | 3 | 2 | 1.04
Christopher D. Twigg | 4 | 2 | 0.37
Peizhao Zhang | 5 | 2 | 0.37
Jeff Petkau | 6 | 2 | 0.37
Tsz-Ho Yu | 7 | 2 | 0.37
Chun-Jung Tai | 8 | 2 | 1.04
Muzaffer Akbay | 9 | 2 | 0.37
Zheng Wang | 10 | 2 | 0.37
Asaf Nitzan | 11 | 2 | 0.37
Dong Gang | 12 | 45 | 5.07
Yuting Ye | 13 | 179 | 10.18
Lingling Tao | 14 | 2 | 0.71
Chengde Wan | 15 | 2 | 0.37
Robert Y. Wang | 16 | 544 | 26.88