Title
Relation Extraction as Open-book Examination: Retrieval-enhanced Prompt Tuning
Abstract
Pre-trained language models (PLMs) have contributed significantly to relation extraction (RE) by demonstrating remarkable few-shot learning abilities. However, prompt tuning methods for relation extraction may still fail to generalize to rare or hard patterns. Note that the conventional parametric learning paradigm can be viewed as memorization: the training data serve as a book and inference is a closed-book test. Long-tailed or hard patterns can hardly be memorized in model parameters given only a few training instances. To this end, we regard RE as an open-book examination and propose a new semiparametric paradigm of retrieval-enhanced prompt tuning for relation extraction. We construct an open-book datastore for retrieval that memorizes prompt-based instance representations and their corresponding relation labels as key-value pairs. During inference, the model infers relations by linearly interpolating the base output of the PLM with the non-parametric nearest-neighbor distribution over the datastore. In this way, our model not only infers relations through knowledge stored in its weights during training but also assists decision-making by retrieving and consulting examples in the open-book datastore. Extensive experiments on benchmark datasets show that our method achieves state-of-the-art results in both standard supervised and few-shot settings.
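To make the inference step concrete, below is a minimal NumPy sketch of the interpolation the abstract describes: a k-nearest-neighbor label distribution over memorized key-value pairs is blended with the PLM's output distribution. The function names, the L2 distance, the softmax-style temperature, and the weight lam=0.5 are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch of retrieval-enhanced inference (hypothetical names and parameters).
# Datastore: prompt-based instance representations (keys) -> relation labels (values).
import numpy as np

def knn_distribution(query, keys, labels, num_relations, k=16, temperature=1.0):
    """Non-parametric label distribution from the k nearest datastore entries."""
    dists = np.linalg.norm(keys - query, axis=1)     # L2 distance to every stored key
    nearest = np.argsort(dists)[:k]                  # indices of the k nearest keys
    weights = np.exp(-dists[nearest] / temperature)  # closer neighbors weigh more
    p_knn = np.zeros(num_relations)
    for idx, w in zip(nearest, weights):
        p_knn[labels[idx]] += w                      # aggregate weight per relation label
    return p_knn / p_knn.sum()

def interpolate(p_plm, p_knn, lam=0.5):
    """Final prediction: linear mix of parametric (PLM) and non-parametric (kNN) parts."""
    return lam * p_knn + (1.0 - lam) * p_plm

# Toy usage: 100 memorized instances, 5 relation types, 256-d representations.
rng = np.random.default_rng(0)
keys = rng.normal(size=(100, 256))                   # prompt-based instance representations
labels = rng.integers(0, 5, size=100)                # their gold relation labels
query = rng.normal(size=256)                         # representation of a test instance
p_plm = np.full(5, 0.2)                              # placeholder PLM output distribution
p_final = interpolate(p_plm, knn_distribution(query, keys, labels, num_relations=5))
print(p_final.argmax())                              # predicted relation id
```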
Year
2022
DOI
10.1145/3477495.3531746
Venue
SIGIR '22: Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
Keywords
Relation Extraction, Prompt Tuning, Few-shot Learning
DocType
Conference
Citations
0
PageRank
0.34
References
2
Authors
7
Name | Order | Citations | PageRank
Xiang Chen | 1 | 0 | 0.68
Li, Lei | 2 | 799 | 69.54
Ningyu Zhang | 3 | 63 | 18.56
Chuanqi Tan | 4 | 29 | 9.25
Fei Huang | 5 | 2 | 7.54
Luo Si | 6 | 2498 | 169.52
Huanhuan Chen | 7 | 731 | 101.79