Title
Human-Human Interaction Recognition Based On Spatial And Motion Trend Feature
Abstract
Human-human interaction recognition has attracted increasing attention in recent years due to its wide range of applications in computer vision. However, few RGBD-based human-human interaction datasets are currently publicly available. This paper introduces a new dataset for human-human interaction recognition. Furthermore, a novel feature descriptor based on the spatial relationship and the semantic motion trend similarity between body parts is proposed for human-human interaction recognition. The motion trend of each skeleton joint is first quantized into a specific semantic word, and then a kernel is built to measure the similarity of either intra- or inter-body parts by histogram intersection. Finally, the proposed feature descriptor is evaluated on the SBU interaction dataset and the collected dataset. Experimental results demonstrate that our method outperforms state-of-the-art methods.
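The abstract describes quantizing each skeleton joint's motion trend into semantic words and comparing body-part word histograms with a histogram-intersection kernel. The sketch below is only an illustration of that idea under stated assumptions, not the authors' implementation: the direction-based word vocabulary (n_bins), the in-plane quantization, and the normalization are all assumptions made for the example.

```python
# Minimal sketch (not the paper's code): quantize joint motion trends into
# discrete "semantic words" and compare histograms with histogram intersection.
import numpy as np

def motion_trend_words(joint_positions, n_bins=8):
    """Quantize per-frame displacement directions of one joint into word labels.

    joint_positions: array of shape (T, 3) with the joint's x, y, z over T frames.
    Returns T-1 integer labels in [0, n_bins), based on the in-plane motion
    direction (a simplifying assumption for this sketch).
    """
    disp = np.diff(joint_positions, axis=0)        # frame-to-frame displacement
    angles = np.arctan2(disp[:, 1], disp[:, 0])    # direction in the x-y plane
    bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int)
    return np.clip(bins, 0, n_bins - 1)

def word_histogram(words, n_bins=8):
    """Normalized histogram of semantic words for one joint or body part."""
    hist = np.bincount(words, minlength=n_bins).astype(float)
    return hist / max(hist.sum(), 1.0)

def histogram_intersection(h1, h2):
    """Histogram-intersection similarity between two word histograms."""
    return np.minimum(h1, h2).sum()

# Toy usage: similarity of the motion trends of two synthetic joint trajectories.
rng = np.random.default_rng(0)
joint_a = np.cumsum(rng.normal(size=(30, 3)), axis=0)
joint_b = np.cumsum(rng.normal(size=(30, 3)), axis=0)
h_a = word_histogram(motion_trend_words(joint_a))
h_b = word_histogram(motion_trend_words(joint_b))
print("motion-trend similarity:", histogram_intersection(h_a, h_b))
```

In the paper, such similarities between intra- and inter-body parts would be combined with the spatial-relationship features before classification; the toy trajectories above only show how the kernel value itself is computed.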
Year
2017
Venue
2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP)
Keywords
Human-human interaction, Action recognition, Semantic moving words, RGBD dataset
Field
Kernel (linear algebra), Histogram, Computer vision, Feature descriptor, Pattern recognition, Computer science, Spatial relationship, Feature extraction, Human interaction, Artificial intelligence, Semantics
DocType
Conference
ISSN
1522-4880
Citations
1
PageRank
0.35
References
0
Authors
4
Name          Order  Citations  PageRank
Bangli Liu    1      11         2.85
Haibin Cai    2      38         6.46
Xiaofei Ji    3      154        8.57
Honghai Liu   4      1974       178.69