Title
SGAP-Net: Semantic-Guided Attentive Prototypes Network for Few-Shot Human-Object Interaction Recognition
Abstract
Extreme instance imbalance among categories and the combinatorial explosion of possible action-object combinations make Human-Object Interaction (HOI) recognition a challenging task, and few studies have addressed both challenges directly. Motivated by the success of few-shot learning in building robust models from only a few instances, we formulate HOI recognition as a few-shot task in a meta-learning framework to alleviate these challenges. Because HOI is intrinsically diverse and interactive, we propose a Semantic-Guided Attentive Prototypes Network (SGAP-Net) that learns a semantic-guided metric space in which HOI recognition is performed by computing distances to attentive prototypes of each class. Specifically, the model generates attentive prototypes guided by the category names of actions and objects, which highlight the commonalities of images from the same HOI class. In addition, we design a novel decision method to alleviate the biases produced by different patterns of the same action in HOI. Finally, to support the few-shot HOI task, we reorganize two HOI benchmark datasets into HICO-FS and TUHOI-FS. Extensive experiments on both datasets demonstrate the effectiveness of the proposed SGAP-Net.
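The abstract describes classifying queries by their distance to per-class prototypes in a learned metric space, with prototypes attended by semantic embeddings of class names. The following is a minimal illustrative sketch of that generic prototype-based scheme, not the authors' SGAP-Net implementation; the tensor names and the simple dot-product attention are assumptions for illustration.

```python
# Sketch (assumed, simplified) of semantic-guided attentive prototypes and
# nearest-prototype classification for a few-shot episode.
import torch
import torch.nn.functional as F

def attentive_prototypes(support_feats, support_labels, semantic_embs, n_way):
    """Build one prototype per class from support-set features.

    support_feats:  (N, D) visual embeddings of support images
    support_labels: (N,)   class indices in [0, n_way)
    semantic_embs:  (n_way, D) embeddings of the class names
    Returns a (n_way, D) tensor of prototypes.
    """
    protos = []
    for c in range(n_way):
        feats_c = support_feats[support_labels == c]            # (K, D)
        # Assumed attention: weight each support sample by its similarity
        # to the class-name embedding, then aggregate.
        attn = F.softmax(feats_c @ semantic_embs[c], dim=0)     # (K,)
        protos.append((attn.unsqueeze(1) * feats_c).sum(dim=0))
    return torch.stack(protos)                                   # (n_way, D)

def classify(query_feats, prototypes):
    """Soft-assign each query to classes by distance to prototypes."""
    dists = torch.cdist(query_feats, prototypes)                  # (Q, n_way)
    return F.softmax(-dists, dim=1)

# Toy 5-way 5-shot episode with random features.
D, n_way, k_shot, n_query = 64, 5, 5, 15
support = torch.randn(n_way * k_shot, D)
labels = torch.arange(n_way).repeat_interleave(k_shot)
semantics = torch.randn(n_way, D)
queries = torch.randn(n_query, D)
probs = classify(queries, attentive_prototypes(support, labels, semantics, n_way))
print(probs.shape)  # torch.Size([15, 5])
```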
Year
2020
Venue
AAAI
DocType
Conference
Volume
34
ISSN
2159-5399
Citations
0
PageRank
0.34
References
0
Authors
4
Name          Order
Zhong Ji      1
Liu Xiyao     2
Yanwei Pang   3
Xuelong Li    4