Title
Human interaction categorization by using audio-visual cues
Abstract
Human Interaction Recognition (HIR) in uncontrolled TV video material is a very challenging problem because of the huge intra-class variability of the classes (due to large differences in the way actions are performed, lighting conditions, and camera viewpoints, among others) as well as the small inter-class variability (e.g., the visual difference between hug and kiss is very subtle). Most previous works have focused only on visual information (i.e., the image signal), thus missing an important source of information present in human interactions: the audio. So far, such approaches have not proven discriminative enough. This work proposes the use of an Audio-Visual Bag of Words (AVBOW) as a more powerful mechanism for approaching the HIR problem than the traditional Visual Bag of Words (VBOW). We show in this paper that the combined use of video and audio information yields better classification results than video alone. Our approach has been validated on the challenging TVHID dataset, showing that the proposed AVBOW provides statistically significant improvements over the VBOW employed in the related literature.
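The core idea in the abstract is a per-modality bag-of-words representation fused into one descriptor. As a minimal sketch (not the paper's actual pipeline: the descriptor types, codebook sizes, and fusion scheme here are illustrative assumptions), each modality's local descriptors are quantized against a learned codebook into a normalized histogram, and the visual and audio histograms are concatenated before classification:

```python
# Hedged sketch of an Audio-Visual Bag of Words (AVBOW) feature.
# All names and toy values are illustrative assumptions, not taken
# from the paper; the real system learns codebooks (e.g., by k-means)
# over real video/audio descriptors.
from math import dist


def bow_histogram(descriptors, codebook):
    """Quantize each descriptor to its nearest codeword (Euclidean
    distance) and return an L1-normalized histogram over the codebook."""
    hist = [0.0] * len(codebook)
    for d in descriptors:
        nearest = min(range(len(codebook)), key=lambda i: dist(d, codebook[i]))
        hist[nearest] += 1.0
    total = sum(hist) or 1.0  # guard against an empty descriptor set
    return [h / total for h in hist]


def avbow(visual_desc, audio_desc, visual_codebook, audio_codebook):
    """Fuse modalities by concatenating the visual and audio BoW
    histograms into a single feature vector for the classifier."""
    return (bow_histogram(visual_desc, visual_codebook)
            + bow_histogram(audio_desc, audio_codebook))
```

The resulting vector has length `|visual codebook| + |audio codebook|` and can be fed to any standard classifier (e.g., an SVM); a visual-only VBOW baseline would simply drop the audio half.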
Year: 2014
DOI: 10.1007/s00138-013-0521-1
Venue: Mach. Vis. Appl.
Keywords: Human interactions, Audio, Video, BOW
Field: Bag-of-words model, Sensory cue, Computer vision, Categorization, Viewpoints, Computer science, Human interaction, Artificial intelligence, Kiss, Image signal, Discriminative model
DocType: Journal
Volume: 25
Issue: 1
ISSN: 0932-8092
Citations: 11
PageRank: 0.50
References: 31
Authors: 4
1. Manuel J. Marín-Jiménez (Citations: 720, PageRank: 41.48)
2. R. Muñoz-Salinas (Citations: 304, PageRank: 13.89)
3. E. Yeguas-Bolivar (Citations: 52, PageRank: 2.80)
4. N. Pérez de la Blanca (Citations: 101, PageRank: 10.33)