Title
Function-words Adaptively Enhanced Attention Networks for Few-Shot Inverse Relation Classification
Abstract
Relation classification aims to identify the semantic relation between two entities in a given text. While existing models classify inverse relations well when large datasets are available, their performance drops significantly in few-shot settings. In this paper, we propose a function-words adaptively enhanced attention framework (FAEA) for few-shot inverse relation classification, in which a hybrid attention model is designed to attend to class-related function words based on meta-learning. Since involving function words introduces significant intra-class redundancy, an adaptive message-passing mechanism is introduced to capture and transfer inter-class differences. We mathematically analyze the negative impact of function words on dot-product measurement, which explains why the message-passing mechanism effectively reduces this impact. Experimental results show that FAEA outperforms strong baselines; in particular, inverse relation accuracy improves by 14.33% under the 1-shot setting on FewRel 1.0.
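The abstract's point about dot-product measurement can be illustrated with a toy example. The sketch below is not the authors' code: it assumes a prototypical-network-style matcher in which class prototypes are support-set means scored against a query by dot product, and the subtraction of the shared component is only a crude stand-in for the paper's adaptive message passing. All vector names and magnitudes are hypothetical.

import numpy as np

# Toy illustration (not from the paper): a strong feature component shared
# across classes -- standing in here for function-word features -- inflates
# dot-product scores and blurs inter-class differences in few-shot matching.
rng = np.random.default_rng(0)
dim = 8
content_a = rng.normal(size=dim)        # content features of relation A
content_b = rng.normal(size=dim)        # content features of its inverse relation B
shared = 3.0 * rng.normal(size=dim)     # hypothetical shared function-word component

proto_a = content_a + shared            # support-set prototype of class A
proto_b = content_b + shared            # support-set prototype of class B
query = content_a + shared              # query instance that truly belongs to A

def dot_scores(q, protos):
    # prototypical-network-style matching: score = dot product with each prototype
    return {name: float(q @ p) for name, p in protos.items()}

print("raw prototypes:  ", dot_scores(query, {"A": proto_a, "B": proto_b}))

# Crude stand-in for emphasizing inter-class differences: remove the component
# the two prototypes have in common before matching. FAEA's adaptive message
# passing is more elaborate; this only shows the direction of the effect.
common = (proto_a + proto_b) / 2
print("differences only:", dot_scores(query - common,
                                       {"A": proto_a - common, "B": proto_b - common}))

With the shared component in place, both dot-product scores are dominated by that component and end up close together; after removing it, the query separates cleanly toward class A, which is the intuition behind reducing function-word redundancy while keeping inter-class differences.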
Year
2022
DOI
10.24963/ijcai.2022/407
Venue
International Joint Conference on Artificial Intelligence (IJCAI-ECAI 2022)
Keywords
Machine Learning: Few-shot learning, Natural Language Processing: Text Classification
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
5
Name                Order   Citations   PageRank
Chunliu Dou         1       0           0.34
Shaojuan Wu         2       0           2.03
Xiaowang Zhang      3       163         38.77
Zhiyong Feng        4       794         167.21
Kewen Wang (王克文)  5       591         54.88