| Abstract |
|---|
| We propose a Dynamic Graph-Based Spatial-Temporal Attention (DG-STA) method for hand gesture recognition. The key idea is to first construct a fully-connected graph from a hand skeleton, where the node features and edges are then automatically learned via a self-attention mechanism that operates in both spatial and temporal domains. We further propose to leverage the spatial-temporal cues of joint positions to guarantee robust recognition in challenging conditions. In addition, a novel spatial-temporal mask is applied to significantly cut down the computational cost by 99%. We carry out extensive experiments on benchmarks (DHG-14/28 and SHREC'17) and prove the superior performance of our method compared with the state-of-the-art methods. The source code can be found at https://github.com/yuxiaochen1103/DG-STA. |
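The core idea in the abstract — self-attention over a fully-connected graph of skeleton joints across frames, restricted by a spatial-temporal mask — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (see their repository for that); the shapes, random weights, and the exact masking rule (attend within the same frame or to the same joint across frames) are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of masked spatial-temporal self-attention over a hand skeleton.
# Assumed setup: T frames, J joints, d-dim node features; parameter matrices
# are random placeholders, not trained weights.
T, J, d = 4, 22, 8                 # 22 hand joints, as in DHG-14/28 / SHREC'17
rng = np.random.default_rng(0)
X = rng.normal(size=(T * J, d))    # one row per (frame, joint) node

# Spatial-temporal mask: a node attends only to nodes in the same frame
# (spatial edges) or to the same joint across frames (temporal edges).
frame = np.repeat(np.arange(T), J)
joint = np.tile(np.arange(J), T)
mask = (frame[:, None] == frame[None, :]) | (joint[:, None] == joint[None, :])

Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv
scores = Q @ K.T / np.sqrt(d)
scores[~mask] = -np.inf            # masked pairs contribute zero attention
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)
out = attn @ V                     # updated node features

# The mask keeps only T*J*(T + J - 1) of the (T*J)^2 score entries, which is
# where the computational saving the abstract mentions comes from.
print(mask.sum(), (T * J) ** 2)
```

With longer sequences the kept fraction shrinks further, since the full attention matrix grows as (T·J)² while the masked one grows only linearly in T + J per node.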
| Year | Venue | DocType |
|---|---|---|
| 2019 | BMVC | Conference |

| Citations | PageRank | References |
|---|---|---|
| 1 | 0.35 | 0 |
| Authors |
|---|
| 5 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Yuxiao Chen | 1 | 10 | 3.84 |
| Long Zhao | 2 | 30 | 6.23 |
| Xi Peng | 3 | 123 | 13.67 |
| Jianbo Yuan | 4 | 1 | 0.35 |
| Dimitris N. Metaxas | 5 | 8834 | 952.25 |