Title
Prompt-Based Prototypical Framework for Continual Relation Extraction
Abstract
Continual relation extraction (CRE) is an important task in continual learning, which aims to continually learn newly emerging relations between entities from text. To avoid catastrophically forgetting old relations, existing research efforts have explored memory replay methods that store typical historical instances, or embed all observed relations as prototypes, in an episodic memory and replay them during subsequent training. However, these methods generally fail to exploit the relational knowledge contained in pre-trained language models (PLMs), which could provide enlightening information for representing new relations based on known ones. To this end, we investigate CRE from a novel perspective by generating knowledge-infused relation prototypes that leverage the relational knowledge in a PLM through prompt tuning. Specifically, based on typical samples selected from the historical instances with the K-means algorithm, we devise novel relational knowledge-infused prompts to elicit relational knowledge from the PLM and generate knowledge-infused relation prototypes. These prototypes are then used to refine the embeddings of the typical examples and to compute a stability-plasticity balance score for adjusting the memory replay process. Experimental results show that our method outperforms state-of-the-art baseline models on CRE. Further extensive analysis shows that the proposed method is robust to memory size, task order, task sequence length, and the number of training instances.
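The following is a minimal, illustrative sketch (not the authors' released implementation) of two steps the abstract describes: selecting typical samples per relation with the K-means algorithm, and forming a cloze-style knowledge-infused prompt whose masked position a PLM could encode into a relation prototype. All function names, the template wording, and the embedding dimensions are assumptions made for illustration.

# Sketch assumes scikit-learn and NumPy; random embeddings stand in for PLM features.
import numpy as np
from sklearn.cluster import KMeans

def select_typical_samples(embeddings: np.ndarray, k: int) -> list[int]:
    """Cluster one relation's instance embeddings into k groups and keep
    the instance closest to each centroid as a 'typical' memory sample."""
    k = min(k, len(embeddings))
    kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(embeddings)
    picked = []
    for centroid in kmeans.cluster_centers_:
        dists = np.linalg.norm(embeddings - centroid, axis=1)
        picked.append(int(dists.argmin()))
    return picked

def build_relation_prompt(sentence: str, head: str, tail: str) -> str:
    """A hypothetical cloze template; the paper's exact template may differ.
    The PLM's hidden state at [MASK] would contribute to the
    knowledge-infused relation prototype."""
    return f"{sentence} The relation between {head} and {tail} is [MASK]."

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_embeddings = rng.normal(size=(50, 768))  # stand-in for PLM encodings
    print("stored instance ids:", select_typical_samples(fake_embeddings, k=5))
    print(build_relation_prompt(
        "Marie Curie was born in Warsaw.", "Marie Curie", "Warsaw"))

In the paper, the generated prototypes additionally refine the stored examples' embeddings and feed the stability-plasticity balance score; those steps are omitted from this sketch.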
Year
2022
DOI
10.1109/TASLP.2022.3199655
Venue
IEEE/ACM Transactions on Audio, Speech, and Language Processing
Keywords
Task analysis, Prototypes, Stability analysis, Feature extraction, Training, Speech processing, Security, Continual learning, relation extraction, catastrophic forgetting, prompt method, prototype
DocType
Journal
Volume
30
Issue
1
ISSN
2329-9290
Citations
0
PageRank
0.34
References
0
Authors
5
Name          Order  Citations  PageRank
Han Zhang     1      9          20.42
Liang Bin     2      239        54.58
Min Yang      3      155        41.56
Hui Wang      4      177        43.68
Xu Ruifeng    5      432        53.04