Title
EarSense: Earphones as a Teeth Activity Sensor
Abstract
This paper finds that actions of the teeth, namely tapping and sliding, produce vibrations in the jaw and skull. These vibrations are strong enough to propagate to the edge of the face and produce vibratory signals at an earphone. By re-tasking the earphone speaker as an input transducer - a software modification in the sound card - we are able to sense teeth-related gestures across various models of earphones and headphones. In fact, by analyzing the signals at the two earphones, we show the feasibility of also localizing teeth gestures, resulting in a human-to-machine interface. Challenges include coping with weak signals, distortions due to differing teeth compositions, limited timing resolution, and spectral dispersion. We address these problems with a sequence of sensing techniques, resulting in the ability to detect 6 distinct gestures in real time. Results from 18 volunteers exhibit robustness, even though our system - EarSense - does not depend on per-user training. Importantly, EarSense also remains robust in the presence of concurrent user activities such as walking, nodding, cooking, and cycling. Our ongoing work focuses on detecting teeth gestures even while music is being played in the earphone; once that problem is solved, we believe EarSense could be even more compelling.
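The abstract describes re-tasking the earphone speaker as an input transducer and comparing the signals arriving at the two earphones to localize a gesture. As a rough illustration of that two-channel idea only - not the paper's actual pipeline - the sketch below thresholds short-time energy to flag candidate taps in a stereo recording and guesses the side by comparing channel energies. The file name, window size, threshold, and the left/right rule are all hypothetical assumptions.

```python
# Hypothetical sketch: flag teeth-tap-like energy spikes in a two-channel
# earphone recording and guess which side they came from. This is NOT the
# EarSense algorithm; parameters and file name are illustrative assumptions.
import numpy as np
from scipy.io import wavfile

def detect_taps(path="earphone.wav", win=256, thresh=5.0):
    rate, x = wavfile.read(path)             # expects stereo: shape (samples, 2)
    x = x.astype(np.float64)
    x /= np.max(np.abs(x)) + 1e-12           # normalize to [-1, 1]
    n = (len(x) // win) * win
    frames = x[:n].reshape(-1, win, 2)       # non-overlapping windows
    energy = (frames ** 2).mean(axis=1)      # per-window, per-channel energy
    noise = np.median(energy, axis=0)        # crude noise-floor estimate
    events = []
    for i, e in enumerate(energy):
        if e.max() > thresh * noise.max():   # energy spike => candidate tap
            side = "left" if e[0] > e[1] else "right"
            events.append((i * win / rate, side))
    return events

if __name__ == "__main__":
    for t, side in detect_taps():
        print(f"candidate tap at {t:.2f}s, nearer the {side} earphone")
```

A real system would need much more, e.g. per-gesture spectral features and debouncing of adjacent windows; this only shows why having two synchronized channels gives coarse left/right information for free.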
Year: 2020
DOI: 10.1145/3372224.3419197
Venue: MobiCom '20: The 26th Annual International Conference on Mobile Computing and Networking, London, United Kingdom, September 2020
DocType: Conference
ISBN: 978-1-4503-7085-1
Citations: 1
PageRank: 0.35
References: 19
Authors: 5
Name                  Order  Citations  PageRank
J. Prakash            1      10         3.91
Z. Yang               2      10         1.65
Y. Wei                3      2          0.71
H. Hassanieh          4      591        37.63
Romit Roy Choudhury   5      3951       233.31