Title
Walk a Robot Dog in VR!
Abstract
Realistic locomotion in a virtual environment (VE) can help maximize immersion and decrease simulator sickness. Redirected walking (RDW) allows a user to physically walk in VR by rotating the VE as a function of head rotation so that the user walks in an arc that fits within the tracking area. However, this requires significant user rotation, so commercial-sized tracking spaces often need a “distractor” to induce it. Previous implementations suddenly spawned a distractor (e.g., a butterfly) when the user walked near the safe boundary, with limitations such as the user triggering the distraction accidentally by looking around, the distractor going unacknowledged, or the user getting “stuck” in a corner. We explore a persistent robot distractor tethered to the user that provides two-way haptic feedback and natural motion constraints. We design a dynamic robot AI that adapts to randomness in the user’s behavior, as well as to trajectory changes caused by tugging on its leash. The robot tries to imperceptibly keep the user safe by replicating a real dog’s behaviors, such as barking or sniffing something. We hypothesize that the naturalness of the dog’s behavior, its responses to the user, and the haptic tethering will together allow the user to explore the entire city, ideally without noticing that the dog is a robot.
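The core RDW mechanism the abstract describes, rotating the VE as a function of head rotation so the user's real path curves into the tracking area, can be sketched with a simple rotation gain. This is a minimal illustrative sketch, not the paper's implementation; the function name and the gain value of 1.3 are assumptions for demonstration.

```python
def redirect_rotation(head_yaw_delta_deg: float, rotation_gain: float = 1.3) -> float:
    """Extra yaw (in degrees) to inject into the virtual environment
    for a measured change in the user's physical head yaw.

    With rotation_gain > 1, the virtual scene turns faster than the
    user's head, so the user physically rotates less than they perceive;
    applied continuously, this steers the real-world walking path into
    an arc that fits inside the tracking boundary.
    """
    return head_yaw_delta_deg * (rotation_gain - 1.0)


# Example: over a 10-degree physical head turn, the VE is rotated an
# additional 3 degrees (on top of the user's own rotation).
extra = redirect_rotation(10.0, rotation_gain=1.3)
```

Because the injected rotation is proportional to the user's own head motion, a user who rarely turns their head yields little redirection, which is why the abstract's distractor (the robot dog) exists: it elicits the head rotation the gain needs to work with.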
Year
2020
DOI
10.1145/3388536.3407897
Venue
SIGGRAPH '20: Special Interest Group on Computer Graphics and Interactive Techniques Conference, Virtual Event, USA, August 2020
DocType
Conference
ISBN
978-1-4503-7968-7
Citations
0
PageRank
0.34
References
0
Authors
2
Name | Order | Citations | PageRank
Nicholas Rewkowski | 1 | 15 | 4.67
Ming Lin | 2 | 70465 | 25.99