Title
KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction
Abstract
Recently, prompt-tuning has achieved promising results for specific few-shot classification tasks. The core idea of prompt-tuning is to insert text pieces (i.e., templates) into the input and transform a classification task into a masked language modeling problem. However, for relation extraction, determining an appropriate prompt template requires domain expertise, and obtaining a suitable label word is cumbersome and time-consuming. Furthermore, there is abundant semantic and prior knowledge among the relation labels that cannot be ignored. To this end, we focus on incorporating knowledge among relation labels into prompt-tuning for relation extraction and propose a Knowledge-aware Prompt-tuning approach with synergistic optimization (KnowPrompt). Specifically, we inject the latent knowledge contained in relation labels into prompt construction with learnable virtual type words and answer words. We then synergistically optimize their representations with structured constraints. Extensive experimental results on five datasets, under both standard and low-resource settings, demonstrate the effectiveness of our approach. Our code and datasets are available on GitHub for reproducibility.
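The core idea summarized above, recasting classification as masked-language-model infilling over a template, can be illustrated with a minimal sketch. This is not the authors' KnowPrompt implementation (which learns virtual type words and answer words optimized with structured constraints); the textual template, the example sentence, the candidate answer words, and the roberta-base checkpoint below are illustrative assumptions only.

```python
# Minimal sketch of prompt-based relation scoring with a masked language model.
# All names and templates here are illustrative assumptions, not KnowPrompt itself.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")
model.eval()

sentence = "Steve Jobs co-founded Apple in 1976."
subj, obj = "Steve Jobs", "Apple"

# Template: the relation between subject and object is expressed as a [MASK] slot.
# (KnowPrompt instead uses learnable virtual type/answer words; this is a plain textual prompt.)
prompt = f"{sentence} {subj} {tokenizer.mask_token} {obj}."

# Hypothetical single-word verbalizations of a few relation labels.
candidate_answers = {"founded": "org:founded_by", "joined": "per:employee_of", "visited": "no_relation"}

inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]

with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos.item()]  # vocabulary scores at the [MASK] position

# Score each candidate answer word at the [MASK] slot and pick the best-scoring relation.
scores = {}
for word, relation in candidate_answers.items():
    token_ids = tokenizer(f" {word}", add_special_tokens=False).input_ids
    scores[relation] = logits[token_ids[0]].item()  # first sub-token as a crude proxy

print(max(scores, key=scores.get))
```

In KnowPrompt, by contrast, the answer words are not fixed vocabulary tokens: virtual type words and virtual answer words are constructed from knowledge in the relation labels and their representations are synergistically optimized with structured constraints, as described in the abstract.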
Year
2022
DOI
10.1145/3485447.3511998
Venue
International World Wide Web Conference
Keywords
Relation Extraction, Prompt-tuning, Knowledge-aware
DocType
Conference
Citations
1
PageRank
0.35
References
6
Authors
9
Name            Order   Citations   PageRank
Chen Xiang      1       313         5.72
Xin Xie         2       1           1.02
Ningyu Zhang    3       63          18.56
Jiahuan Yan     4       1           0.35
Shumin Deng     5       32          10.61
Chuanqi Tan     6       29          9.25
Fei Huang       7       2           7.54
Luo Si          8       2498        169.52
Huanhuan Chen   9       731         101.79