Title
Deep Stacked Bidirectional LSTM Neural Network for Skeleton-Based Action Recognition
Abstract
Skeleton-based action recognition has made great progress recently. However, many problems remain unsolved. For example, the representations of skeleton sequences learned by most existing methods lack spatial structure information and detailed temporal dynamics features. To this end, we propose a novel Deep Stacked Bidirectional LSTM Network (DSB-LSTM) for human action recognition from skeleton data. Specifically, we first exploit human body geometry to extract the skeletal modulus ratio features (MR) and the skeletal vector angle features (VA) from the skeleton data. Then, the DSB-LSTM is applied to learn both spatial and temporal representations from the MR and VA features. This network yields not only a more powerful representation but also stronger generalization capability. We perform several experiments on the MSR Action3D, Florence 3D, and UTKinect-Action datasets. The results show that our approach outperforms the compared methods on all datasets, demonstrating the effectiveness of the DSB-LSTM.
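The abstract names two geometric features computed from skeleton joints before the LSTM stage: modulus ratio (MR) and vector angle (VA). The exact joint pairs and normalization are defined in the paper itself; the sketch below only illustrates the plausible interpretation that MR is a ratio of bone-vector lengths (scale-invariant across subjects) and VA is the angle between two bone vectors. The joint names and values are hypothetical.

```python
import math

def bone(a, b):
    """Bone vector from joint a to joint b (3-D coordinates)."""
    return tuple(bj - aj for aj, bj in zip(a, b))

def modulus_ratio(u, v):
    """Assumed MR feature: ratio of the lengths of two bone vectors."""
    return math.hypot(*u) / math.hypot(*v)

def vector_angle(u, v):
    """Assumed VA feature: angle (radians) between two bone vectors."""
    dot = sum(ui * vi for ui, vi in zip(u, v))
    cos = dot / (math.hypot(*u) * math.hypot(*v))
    return math.acos(max(-1.0, min(1.0, cos)))  # clamp for numeric safety

# Toy frame: three joints of one arm (coordinates are made up).
shoulder, elbow, wrist = (0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.3, -0.4, 0.0)
upper_arm = bone(shoulder, elbow)   # length 0.3
forearm = bone(elbow, wrist)        # length 0.4
mr = modulus_ratio(upper_arm, forearm)  # 0.3 / 0.4 = 0.75
va = vector_angle(upper_arm, forearm)   # pi/2 (the vectors are orthogonal)
```

Per frame, such scalars would be collected over many joint pairs into a feature vector, and the sequence of these vectors over time is what a stacked bidirectional LSTM would consume.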
Year: 2019
DOI: 10.1007/978-3-030-34120-6_55
Venue: IMAGE AND GRAPHICS, ICIG 2019, PT I
Keywords: Deep learning, Skeleton-based action recognition, Bidirectional LSTM
DocType: Conference
Volume: 11901
ISSN: 0302-9743
Citations: 1
PageRank: 0.35
References: 0
Authors: 4
Name           Order  Citations  PageRank
Kai Zou        1      1          0.35
Yin Ming       2      114        23.30
Weitian Huang  3      1          1.37
Yiqiu Zeng     4      1          0.35