Abstract |
---|
This work presents the development and implementation of a unified multi-sensor human motion capture and gesture recognition system that can distinguish between and classify six different gestures. Data were collected from eleven participants using five wireless motion sensors (inertial measurement units), a subset of a complete motion capture system, attached to their arms and upper body. We compare Support Vector Machines and Artificial Neural Networks on the same dataset under two different scenarios and evaluate the results. Our study indicates that near-perfect classification accuracies are achievable for small gestures and that the speed of classification is sufficient to allow interactivity. However, such accuracies are harder to obtain when a participant is excluded from training, indicating that more work is needed in this area before the system can serve the general population. |
Year | DOI | Venue |
---|---|---|
2016 | 10.3390/s16050605 | SENSORS |
Keywords | Field | DocType |
gesture recognition, wearable sensors, quaternions, pattern analysis, machine learning, support vector machines, artificial neural networks | Motion capture, Computer vision, Population, Units of measurement, Gesture, Wearable computer, Computer science, Support vector machine, Gesture recognition, Artificial intelligence, Artificial neural network | Journal |
Volume | Issue | ISSN |
16 | 5 | 1424-8220 |
Citations | PageRank | References |
2 | 0.37 | 10 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Shamir Alavi | 1 | 2 | 0.37 |
Dennis Arsenault | 2 | 2 | 0.37 |
Anthony Whitehead | 3 | 143 | 20.84 |