Title
A Transformational Biencoder with In-Domain Negative Sampling for Zero-Shot Entity Linking
Abstract
Recent interest in entity linking has focused on the zero-shot scenario, where at test time the entity mention to be labelled was never seen during training, or may belong to a domain different from the source domain. Current work leveraging pre-trained BERT makes the implicit assumption that BERT bridges the gap between the source and target domain distributions. However, fine-tuned BERT underperforms considerably in the zero-shot setting when applied to a different domain. We address this problem by proposing a Transformational Biencoder that incorporates a transformation into BERT to perform a zero-shot transfer from the source domain during training. Like previous work, we rely on negative entities to encourage our model to discriminate the gold entities during training. To generate these negative entities, we propose a simple but effective strategy that takes the domain of the gold entity into account. Our experimental results on the benchmark dataset Zeshel show the effectiveness of our approach, which achieves a new state of the art.
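The abstract names two components: a biencoder in which a transformation is incorporated into the BERT-based encoder, and a negative sampling strategy tied to the gold entity's domain. Below is a minimal PyTorch sketch of the first idea, under stated assumptions: the ToyEncoder stand-in, the single linear transform on the mention vector, first-token pooling, and all shapes are illustrative, not the paper's exact architecture.

```python
import torch
import torch.nn as nn


class ToyEncoder(nn.Module):
    """Stand-in for a pre-trained BERT: embeddings only, returns (batch, seq, hidden)."""

    def __init__(self, vocab_size: int = 30522, hidden_size: int = 64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, hidden_size)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return self.emb(token_ids)


class TransformationalBiencoder(nn.Module):
    def __init__(self, encoder: nn.Module, hidden_size: int = 64):
        super().__init__()
        self.encoder = encoder
        # Hypothetical transformation: a learned linear map on the mention
        # representation, standing in for the paper's zero-shot transfer component.
        self.transform = nn.Linear(hidden_size, hidden_size)

    def encode(self, token_ids: torch.Tensor) -> torch.Tensor:
        # First-token ([CLS]-style) pooling of the encoder output.
        return self.encoder(token_ids)[:, 0, :]

    def forward(self, mention_ids: torch.Tensor, candidate_ids: torch.Tensor) -> torch.Tensor:
        m = self.transform(self.encode(mention_ids))  # (batch, hidden)
        e = self.encode(candidate_ids)                # (num_candidates, hidden)
        return m @ e.T                                # similarity scores (batch, num_candidates)


model = TransformationalBiencoder(ToyEncoder())
mentions = torch.randint(0, 30522, (2, 16))       # 2 tokenized mention contexts
candidates = torch.randint(0, 30522, (8, 32))     # gold + negative entity descriptions
scores = model(mentions, candidates)
gold = torch.tensor([0, 3])                       # index of the gold entity per mention
loss = nn.functional.cross_entropy(scores, gold)  # score gold above the negatives
```

For the sampling strategy itself, the abstract only says negatives are generated with the gold entity's domain "into account"; the title names it in-domain negative sampling. A plausible reading, sketched below with hypothetical function and field names, is to draw negatives from the same domain (a Wikia "world", in Zeshel) as the gold entity.

```python
import random
from collections import defaultdict


def build_domain_index(entities):
    """Group entity ids by the domain (world) they belong to."""
    index = defaultdict(list)
    for ent in entities:
        index[ent["domain"]].append(ent["id"])
    return index


def sample_in_domain_negatives(gold, domain_index, k=5, rng=random):
    """Draw up to k negatives from the same domain as the gold entity."""
    pool = [e for e in domain_index[gold["domain"]] if e != gold["id"]]
    return rng.sample(pool, min(k, len(pool)))


# Example: negatives for a "muppets" gold entity stay within that domain.
entities = [
    {"id": "kermit", "domain": "muppets"},
    {"id": "gonzo", "domain": "muppets"},
    {"id": "piggy", "domain": "muppets"},
    {"id": "yoda", "domain": "star_wars"},
]
index = build_domain_index(entities)
print(sample_in_domain_negatives(entities[0], index, k=2))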
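```

The intuition for restricting negatives to the gold entity's domain is that same-domain entities are harder to distinguish from the gold entity than randomly sampled ones, giving the biencoder a stronger training signal for the zero-shot setting.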
Year
2022
DOI
10.18653/v1/2022.findings-acl.114
Venue
FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022)
DocType
Conference
Volume
Findings of the Association for Computational Linguistics: ACL 2022
Citations
0
PageRank
0.34
References
0
Authors
5
Name            Order  Citations  PageRank
Kai Sun         1      6          3.52
Richong Zhang   2      232        39.67
Samuel Mensah   3      12         3.57
Yongyi Mao      4      524        61.02
Xudong Liu      5      769        100.74