Title
Deep Convolutional and LSTM Neural Network Architectures on Leap Motion Hand Tracking Data Sequences
Abstract
This paper addresses the hand gesture recognition problem, in which the input is a multidimensional time-series signal acquired from a Leap Motion sensor and the output is a label from a predefined set of gestures. In the present work, we propose the adoption of Convolutional Neural Networks (CNNs), either in combination with a Long Short-Term Memory (LSTM) neural network (i.e., CNN-LSTM) or standalone in a deep architecture (i.e., dCNN), to automate feature learning and classification from the raw input data. The learned features are treated as higher-level abstract representations of the low-level raw time-series signals and are employed in a unified supervised learning and classification model. The proposed CNN-LSTM and deep CNN models achieve recognition rates of 94% on the Leap Motion Hand Gestures for Interaction with 3D Virtual Music Instruments dataset, outperforming previously proposed models based on handcrafted and automatically learned features with LSTM networks.
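The abstract describes a hybrid CNN-LSTM architecture in which 1D convolutions learn local temporal features from the raw multidimensional time series and an LSTM models longer-range temporal dependencies before classification. The sketch below illustrates such a model in PyTorch; the number of input channels, layer widths, sequence length and gesture classes are illustrative assumptions, not the authors' exact configuration.

# Hypothetical CNN-LSTM sketch for multidimensional time-series gesture data
# (e.g. Leap Motion joint coordinates). All sizes are illustrative assumptions.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, num_channels=30, num_classes=10, hidden_size=64):
        super().__init__()
        # 1D convolutions extract local temporal features from the raw signal
        self.conv = nn.Sequential(
            nn.Conv1d(num_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM captures longer-range dynamics over the CNN feature sequence
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, channels, time)
        feats = self.conv(x)            # (batch, 64, time/4)
        feats = feats.permute(0, 2, 1)  # (batch, time/4, 64)
        out, _ = self.lstm(feats)
        return self.fc(out[:, -1, :])   # classify from the last hidden state

# Example: a batch of 8 sequences, 30 input channels, 200 time steps
logits = CNNLSTM()(torch.randn(8, 30, 200))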
Year
2019
DOI
10.23919/EUSIPCO.2019.8902973
Venue
2019 27th European Signal Processing Conference (EUSIPCO)
Keywords
gesture recognition, musical instrument interaction, CNN, LSTM, CNN-LSTM models
Field
Convolutional neural network, Gesture, Leap motion, Computer science, Gesture recognition, Supervised learning, Speech recognition, Tracking data, Artificial neural network, Feature learning
DocType
Conference
ISSN
2076-1465
Citations
0
PageRank
0.34
References
0
Authors
4