Title
Exploring human activity annotation using a privacy preserving 3D model.
Abstract
Annotating activity recognition datasets is a very time-consuming process. Using lay annotators (e.g. via crowd-sourcing) has been suggested as a way to speed this up. However, this requires preserving the privacy of users and may preclude relying on video for annotation. We investigate to what extent a 3D human model, animated from the data of inertial sensors placed on the limbs, allows for the annotation of human activities. We animate the upper body of the 3D model with the data from 5 inertial measurement units obtained from the OPPORTUNITY dataset. The animated model is shown to 6 people in a suite of experiments in order to understand to what extent it can be used for labelling. We present 3 experiments investigating the use of the 3D model for i) activity segmentation, ii) "open-ended" annotation, where users freely describe the activity they see on screen, and iii) traditional annotation, where users pick one activity from a pre-defined list. In the latter case, results show that users recognise the model's activities with 56% accuracy when picking from 11 possible activities.
Year
2016
DOI
10.1145/2968219.2968290
Venue
UbiComp Adjunct
Keywords
Activity recognition, annotation, wearable technologies, 3D human model
Field
Annotation, Activity recognition, Suite, Segmentation, Computer science, Human–computer interaction, Inertial measurement unit, Wearable technology, Multimedia
DocType
Conference
Citations
2
PageRank
0.43
References
10
Authors
3
1. Mathias Ciliberto
2. Daniel Roggen
3. Francisco Javier Ordóñez Morales