Title
Hand Gesture Classification Using Non-Audible Sound
Abstract
Recognizing and distinguishing user behavior and gestures has become important owing to the increasing use of wearable devices such as smartwatches. This study proposes a method for classifying hand gestures by emitting sound in the non-audible frequency range from a smartphone and analyzing the reflected signal. The proposed method converts the recorded reflected sound into an image using the short-time Fourier transform (STFT), and the resulting images are fed to a convolutional neural network (CNN) model to classify hand gestures. The results show an average classification accuracy of 87.75% across eight hand gestures. Additionally, the proposed method is confirmed to achieve higher classification accuracy than other machine learning classification algorithms.
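The abstract describes a three-stage pipeline: emit a near-ultrasonic tone from a smartphone, record the reflection, convert the recording to a spectrogram image with the STFT, and classify the image with a CNN. Below is a minimal Python sketch of that pipeline, not the authors' implementation; the sampling rate, carrier frequency, STFT window parameters, and network layout are illustrative assumptions rather than details taken from the paper.

import numpy as np
import torch
import torch.nn as nn
from scipy.signal import stft

FS = 48_000          # assumed sampling rate (Hz)
CARRIER = 18_000     # assumed non-audible carrier frequency (Hz)
N_CLASSES = 8        # number of hand gestures, per the abstract

def to_spectrogram(signal: np.ndarray) -> np.ndarray:
    """Convert a 1-D recording to a log-magnitude STFT image (window sizes are assumptions)."""
    _, _, Z = stft(signal, fs=FS, nperseg=512, noverlap=384)
    img = np.log1p(np.abs(Z)).astype(np.float32)
    return img[None, ...]  # add a channel axis: (1, freq, time)

class GestureCNN(nn.Module):
    """Small CNN over spectrogram images; the layer layout is an illustrative assumption."""
    def __init__(self, n_classes: int = N_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((8, 8)),  # fixed-size features regardless of clip length
        )
        self.classifier = nn.Linear(32 * 8 * 8, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    # Synthetic 0.5 s recording standing in for a reflected gesture signal.
    t = np.arange(int(0.5 * FS)) / FS
    rec = np.sin(2 * np.pi * CARRIER * t) + 0.05 * np.random.randn(t.size)
    spec = torch.from_numpy(to_spectrogram(rec)).unsqueeze(0)  # batch of 1
    logits = GestureCNN()(spec)  # untrained, so the output is meaningless until trained
    print("predicted gesture class:", logits.argmax(dim=1).item())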
Year
2019
DOI
10.1109/ICUFN.2019.8806145
Venue
2019 Eleventh International Conference on Ubiquitous and Future Networks (ICUFN 2019)
Keywords
non-audible sound, hand gesture, gesture classification, convolutional neural network, short-time Fourier transform
Field
Pattern recognition, Convolutional neural network, Computer science, Gesture, Short-time Fourier transform, Fourier transform, Artificial intelligence, Statistical classification, Wearable technology, Smartwatch, Distributed computing, Gesture classification
DocType
Conference
ISSN
2165-8528
Citations
0
PageRank
0.34
References
0
Authors
3
Name           Order  Citations  PageRank
Jinhyuck Kim   1      0          0.34
Jinwon Cheon   2      0          0.34
Sunwoong Choi  3      112        15.89