Title: A real-time system for motion retrieval and interpretation
Abstract: This paper proposes a new exemplar-based method for real-time human motion recognition using Motion Capture (MoCap) data. We formalize streamed recognizable actions, coming from an online MoCap engine, into a motion graph similar to an animation motion graph. This graph is used as an automaton both to recognize known actions and to add new ones. We define and use a spatio-temporal similarity metric to achieve more accurate feedback on classification. The proposed method has the advantage of being linear and incremental, making the recognition process very fast and the addition of a new action straightforward. Furthermore, actions can be recognized, with a score, even before they are fully completed. Thanks to a skeleton-centric coordinate system, our recognition method is view-invariant. We have successfully tested our action recognition method on both synthetic and real data, and compared our results with four state-of-the-art methods on three well-known datasets for human action recognition. In particular, the comparisons clearly show the advantage of our method through better recognition rates.
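The abstract describes the approach only at a high level. As a rough illustration of the exemplar-matching idea, the Python sketch below matches an incoming pose stream against stored exemplar sequences and scores partial matches frame by frame. It is a minimal sketch under simplifying assumptions, not the authors' implementation: frame-by-frame alignment stands in for the paper's motion-graph automaton, and a plain Euclidean pose distance stands in for its spatio-temporal metric. All names (ActionAutomaton, to_skeleton_centric, pose_distance) are hypothetical.

```python
import numpy as np

def to_skeleton_centric(joints, root=0):
    """Express joint positions relative to a root joint, giving a
    translation-invariant pose (a simplified stand-in for the paper's
    view-invariant, skeleton-centric coordinate system)."""
    return joints - joints[root]

def pose_distance(a, b):
    """Euclidean distance between two normalized poses (a purely spatial
    metric; the paper uses a richer spatio-temporal similarity)."""
    return np.linalg.norm(a - b)

class ActionAutomaton:
    """Each known action is stored as an exemplar sequence of poses.
    Matching advances one state per incoming frame, so the per-frame
    cost is linear in the number of exemplars (incremental matching)."""

    def __init__(self):
        self.exemplars = {}  # action name -> list of normalized poses

    def add_action(self, name, frames):
        # Adding a new action is just storing its exemplar path.
        self.exemplars[name] = [to_skeleton_centric(f) for f in frames]

    def scores(self, observed):
        """Return (progress, mean cost) per action for the frames seen
        so far, so an action can be scored before it is completed."""
        obs = [to_skeleton_centric(f) for f in observed]
        result = {}
        for name, path in self.exemplars.items():
            n = min(len(obs), len(path))
            cost = sum(pose_distance(o, p)
                       for o, p in zip(obs[:n], path[:n])) / n
            result[name] = (n / len(path), cost)
        return result

# Usage: score a partially observed action mid-stream.
rng = np.random.default_rng(0)
wave = [rng.normal(size=(15, 3)) for _ in range(20)]  # 20 frames, 15 joints
automaton = ActionAutomaton()
automaton.add_action("wave", wave)
print(automaton.scores(wave[:8]))  # partial match after 8 frames
```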
Year: 2013
DOI: 10.1016/j.patrec.2012.12.020
Venue: Pattern Recognition Letters
Keywords: new exemplar-based method, recognition method, real-time human motion recognition, action recognition method, motion retrieval, recognition process, real-time system, animation motion graph, better recognition rate, state-of-the-art method, human action recognition, automaton, motion capture
Field: Coordinate system, Computer vision, Motion capture, Graph, Pattern recognition, Computer science, Automaton, Action recognition, Real-time operating system, Human motion, Animation, Artificial intelligence
DocType: Journal
Volume: 34
Issue: 15
ISSN: 0167-8655
Citations: 8
PageRank: 0.52
References: 31
Authors: 4

Name                 Order  Citations  PageRank
Mathieu Barnachon    1      58         2.17
Saïda Bouakaz        2      165        13.33
Boubakeur Boufama    3      162        22.02
Erwan Guillou        4      83         5.10