Title
RadarNet: Efficient Gesture Recognition Technique Utilizing a Miniature Radar Sensor
Abstract
Gestures are a promising candidate as an input modality for ambient computing, where conventional input modalities such as touchscreens are not available. Existing work has focused on gesture recognition using image sensors. However, their cost, high battery consumption, and privacy concerns make cameras challenging as an always-on solution. This paper introduces an efficient gesture recognition technique using a miniaturized 60 GHz radar sensor. The technique recognizes four directional swipes and an omni-swipe using a radar chip (6.5 × 5.0 mm) integrated into a mobile phone. We developed a convolutional neural network model efficient enough for battery-powered, computationally constrained processors. Its model size and inference time are less than 1/5000 of those of an existing radar-based gesture recognition technique. Our evaluations with large-scale datasets, consisting of 558,000 gesture samples and 3,920,000 negative samples, demonstrated our algorithm's efficiency, robustness, and readiness to be deployed outside of research laboratories.
Year
2021
DOI
10.1145/3411764.3445367
Venue
Conference on Human Factors in Computing Systems
Keywords
gesture recognition, mobile, deep learning, radar sensing
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
8
Name              Order  Citations  PageRank
Eiji Hayashi      1      6          3.50
Jaime Lien        2      0          0.68
Nicholas Gillian  3      99         6.75
Leonardo Giusti   4      0          0.34
Dave Weber        5      1          1.04
Jin Yamanaka      6      0          0.34
Lauren Bedal      7      0          0.34
Ivan Poupyrev     8      3373       252.62