Title
Enhanced prototypical network for few-shot relation extraction
Abstract
Most existing methods for relation extraction depend heavily on large-scale annotated data; they cannot learn from prior knowledge and generalize poorly. Few-shot learning methods offer a way to address these problems. Because the commonly used CNN encoder is poorly suited to sequence labeling and to capturing long-range dependencies, we propose a novel model that integrates a transformer into a prototypical network for more powerful relation-level feature extraction. The transformer connects tokens directly, adapting to long-sequence learning without catastrophic forgetting, and captures richer semantic information by attending to several representation subspaces in parallel for each word. We evaluate our method on three tasks: in-domain, cross-domain, and cross-sentence relation extraction. Our method achieves a favorable trade-off between performance and computation, improving on the state-of-the-art prototypical network by approximately 8% across settings. Our experiments also show that our approach is competitive among few-shot learning methods on cross-domain transfer and cross-sentence relation extraction.
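As a concrete illustration of the architecture the abstract describes, below is a minimal sketch of a prototypical network with a transformer encoder. It is not the authors' released implementation: the class name ProtoTransformer, the use of PyTorch's nn.TransformerEncoder, mean pooling over tokens, and all hyperparameters are assumptions made for illustration only.

# Minimal sketch of a transformer-enhanced prototypical network.
# Illustrative only; module choices and hyperparameters are assumed,
# not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProtoTransformer(nn.Module):
    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Multi-head self-attention lets each word attend to several
        # representation subspaces in parallel, as the abstract notes.
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def encode(self, tokens):
        # tokens: (batch, seq_len) -> one vector per sentence.
        h = self.encoder(self.embed(tokens))  # (batch, seq_len, d_model)
        return h.mean(dim=1)                  # mean-pool over tokens

    def forward(self, support, support_labels, query, n_way):
        s = self.encode(support)              # (n_support, d_model)
        q = self.encode(query)                # (n_query, d_model)
        # Prototype = mean of the support embeddings of each relation class.
        prototypes = torch.stack(
            [s[support_labels == c].mean(dim=0) for c in range(n_way)]
        )                                     # (n_way, d_model)
        # Classify each query by negative Euclidean distance to prototypes.
        return -torch.cdist(q, prototypes)    # (n_query, n_way)

A hypothetical 5-way 1-shot episode with random token IDs would run as:

model = ProtoTransformer(vocab_size=10000)
support = torch.randint(0, 10000, (5, 32))   # 5 classes x 1 shot, seq len 32
support_labels = torch.arange(5)
query = torch.randint(0, 10000, (10, 32))
logits = model(support, support_labels, query, n_way=5)  # (10, 5)
loss = F.cross_entropy(logits, torch.randint(0, 5, (10,)))

Training proceeds episodically: cross-entropy over the distance-based logits pulls each query embedding toward its class prototype, which is the standard prototypical-network objective.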
Year
2021
DOI
10.1016/j.ipm.2021.102596
Venue
Information Processing & Management
Keywords
Few-shot learning, Transformer, Relation extraction
DocType
Journal
Volume
58
Issue
4
ISSN
0306-4573
Citations
1
PageRank
0.48
References
0
Authors
5
Name             Order  Citations  PageRank
Wen Wen          1      1          0.48
Yongbin Liu      2      58         11.05
Chunping Ouyang  3      6          3.35
Qiang Lin        4      1          0.48
Tonglee Chung    5      1          0.48