Title
Multisensory cues facilitate coordination of stepping movements with a virtual reality avatar.
Abstract
The effectiveness of simple sensory cues for retraining gait has been demonstrated, yet the feasibility of humanoid avatars for entrainment has yet to be investigated. Here, we describe the development of a novel method of visually cued training, in the form of a virtual partner, and investigate its ability to provide movement guidance in the form of stepping. Real stepping movements were mapped onto an avatar using motion capture data. The trajectory of one of the avatar step cycles was then accelerated or decelerated by 15% to create a perturbation. Healthy participants were motion captured while instructed to step in time to the avatar's movements, as viewed through a virtual reality headset. Step onset times were used to measure the timing errors (asynchronies) between participant and avatar. Participants completed either a visual-only condition, or an auditory-visual condition with footstep sounds included. Participants' asynchronies exhibited slow drift in the Visual-Only condition, but became stable in the Auditory-Visual condition. Moreover, we observed a clear corrective response to the phase perturbation in both auditory-visual conditions. We conclude that an avatar's movements can be used to influence a person's own gait, but should be accompanied by auditory cues congruent with the movement to ensure suitable accuracy is achieved.
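The two core operations the abstract describes, time-scaling one avatar step cycle by 15% and computing step-onset asynchronies, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names and the use of linear-interpolation resampling are our assumptions.

```python
import numpy as np

def perturb_cycle(trajectory, scale=1.15):
    """Time-warp a single step cycle by resampling it onto a stretched
    (scale > 1, decelerated) or compressed (scale < 1, accelerated)
    time base using linear interpolation."""
    n = len(trajectory)
    new_n = int(round(n * scale))
    old_t = np.linspace(0.0, 1.0, n)
    new_t = np.linspace(0.0, 1.0, new_n)
    return np.interp(new_t, old_t, trajectory)

def asynchronies(participant_onsets, avatar_onsets):
    """Signed timing error for each paired step onset (seconds).
    Negative values mean the participant stepped before the avatar."""
    return [p - a for p, a in zip(participant_onsets, avatar_onsets)]
```

For example, a 100-sample cycle passed through `perturb_cycle` with `scale=1.15` comes back 115 samples long (a 15% deceleration at a fixed sampling rate), and `asynchronies([1.02, 2.05], [1.0, 2.0])` yields positive errors of 20 ms and 50 ms, i.e. the participant lagging the avatar.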
Year: 2019
Venue: CoRR
DocType: Journal
Volume: abs/1906.09850
Citations: 0
PageRank: 0.34
References: 0
Authors: 6
Name                    Order  Citations  PageRank
Omar Khan               1      0          0.68
Imran Ahmed             2      3          2.09
Joshua Cottingham       3      0          0.34
Musa Rahhal             4      0          0.34
Theodoros N. Arvanitis  5      0          0.34
Mark Elliott            6      1          1.71