Title
MyoKey: Surface Electromyography and Inertial Motion Sensing-based Text Entry in AR
Abstract
Seamless text input in Augmented Reality (AR) is challenging yet essential for user-friendly AR applications. Existing approaches such as speech input and vision-based gesture recognition suffer from environmental obstacles and from large default keyboards that sacrifice the majority of the screen's real estate in AR. In this paper, we propose MyoKey, a system that enables users to effectively and unobtrusively input text in the constrained environment of AR by jointly leveraging surface Electromyography (sEMG) and Inertial Motion Unit (IMU) signals transmitted by wearable sensors on a user's forearm. MyoKey adopts a deep learning-based classifier to infer hand gestures from sEMG. To show the feasibility of our approach, we implement a mobile AR application using the Unity application building framework. We present novel interaction and system designs that combine hand-gesture information from sEMG with arm-motion information from the IMU to provide a seamless text-entry solution. We demonstrate the applicability of MyoKey through a series of experiments, achieving an accuracy of 0.91 in identifying five gestures in real time (inference time: 97.43 ms).
Year: 2020
DOI: 10.1109/PerComWorkshops48775.2020.9156084
Venue: 2020 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)
Keywords: Textual Input, Augmented Reality, EMG, IMU, Deep Learning
DocType: Conference
ISBN: 978-1-7281-4717-8
Citations: 0
PageRank: 0.34
References: 18
Authors: 7

Name                 Order  Citations  PageRank
Young D. Kwon        1      0          0.34
Kirill A. Shatilov   2      1          1.03
Lik Hang Lee         3      11         7.54
Serkan Kumyol        4      0          0.34
Kit-Yung Lam         5      4          1.40
Yui-Pan Yau          6      4          1.73
Pan Hui              7      0          0.34