Title
Demo: Complex human gestures encoding from wearable inertial sensors for activity recognition.
Abstract
We demonstrate a method to encode complex human gestures acquired from inertial sensors for activity recognition. Gestures are encoded as a stream of symbols which represent the change in orientation and displacement of the body limbs over time. The first novelty of this encoding is that it enables the reuse of previously developed single-channel template matching algorithms even when multiple sensors are used simultaneously. The second novelty is that it encodes changes in limb orientation, which is important in some activities, such as sport analytics. We demonstrate the method using our custom inertial platform, BlueSense. Using a set of five BlueSense nodes, we implemented a motion tracking system that displays a 3D human model and shows the corresponding movement encoding in real time.
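The record gives no implementation details of the encoding. Purely as an illustration of the idea described in the abstract, the following Python sketch quantises per-channel orientation changes into symbols and interleaves the per-sensor symbols into a single stream, so that a single-channel, string-based template matcher can be reused unchanged. The symbol alphabet, the thresholds, and the toy exact-match scorer are assumptions made for this example, not the authors' actual encoding.

import numpy as np

# Hypothetical symbol alphabet: one letter per signed-change bin
# (large negative .. large positive change). The actual alphabet and
# quantisation used by the authors are not given in this record.
SYMBOLS = "abcde"

def encode_channel(angles, thresholds=(-10.0, -2.0, 2.0, 10.0)):
    """Encode one orientation channel (degrees) as a symbol string.

    Each sample-to-sample change is quantised into one of
    len(thresholds)+1 bins and mapped to a symbol.
    """
    deltas = np.diff(np.asarray(angles, dtype=float))
    bins = np.digitize(deltas, thresholds)  # indices 0..len(thresholds)
    return "".join(SYMBOLS[b] for b in bins)

def encode_multichannel(channels):
    """Interleave per-sensor symbols sample by sample into one stream,
    so a single-channel template matcher can process multiple sensors."""
    encoded = [encode_channel(c) for c in channels]
    length = min(len(e) for e in encoded)
    return "".join(e[i] for i in range(length) for e in encoded)

def match_score(stream, template):
    """Toy single-channel matcher: count exact occurrences of template."""
    return sum(stream[i:i + len(template)] == template
               for i in range(len(stream) - len(template) + 1))

if __name__ == "__main__":
    # Two simulated limb-orientation channels (e.g. pitch of two limbs).
    t = np.linspace(0, 2 * np.pi, 50)
    wrist = 30 * np.sin(t)
    elbow = 15 * np.sin(t + 0.5)
    stream = encode_multichannel([wrist, elbow])
    print(stream)
    print("matches of 'dd':", match_score(stream, "dd"))

In practice a more tolerant matcher (e.g. edit-distance or warping-based template matching) would replace the exact-match scorer; the point of the sketch is only that, once all sensors are folded into one symbol stream, any single-channel matcher applies directly.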
Year
2018
Venue
EWSN
Field
Template matching, Computer vision, ENCODE, Activity recognition, Computer science, Gesture, Inertial platform, Real-time computing, Inertial measurement unit, Artificial intelligence, Novelty, Match moving
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name | Order | Citations | PageRank
Mathias Ciliberto | 1 | 33 | 6.12
Luis Ponce Cuspinera | 2 | 7 | 1.39
Daniel Roggen | 3 | 1851 | 137.05