Title
DyLex: Incorporating Dynamic Lexicons into BERT for Sequence Labeling
Abstract
Incorporating lexical knowledge into deep learning models has proven highly effective for sequence labeling tasks. However, previous approaches typically struggle with large-scale, dynamic lexicons, which introduce excessive matching noise and require frequent updates. In this paper, we propose DyLex, a plug-in lexicon incorporation approach for BERT-based sequence labeling. Instead of leveraging embeddings of the words in the lexicon, as conventional methods do, we adopt word-agnostic tag embeddings, so that the representation need not be re-trained when the lexicon is updated. Moreover, we employ an effective supervised denoising method to smooth out lexicon matching noise. Finally, we introduce a column-wise attention-based knowledge fusion mechanism that guarantees the pluggability of the proposed framework. Experiments on ten datasets across three tasks show that the proposed framework achieves new state-of-the-art results, even with very large-scale lexicons.
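The abstract names three mechanisms (word-agnostic tag embeddings, supervised match denoising, and column-wise attention fusion) without implementation detail. Below is a minimal Python sketch of how the tag-embedding fusion step could look; it is an illustration built only from the abstract, so the class name TagFusion, all tensor shapes, and the choice of PyTorch multi-head attention are assumptions, not the authors' released method.

# A minimal sketch of the column-wise attention fusion described in the
# abstract, assuming a PyTorch implementation. The class name TagFusion,
# all tensor shapes, and the use of nn.MultiheadAttention are assumptions.
import torch
import torch.nn as nn

class TagFusion(nn.Module):
    def __init__(self, hidden_size: int = 768, num_tags: int = 16):
        super().__init__()
        # Tag embeddings are indexed by lexicon tag id rather than by word,
        # so the lexicon can be updated without re-training these weights.
        self.tag_emb = nn.Embedding(num_tags, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads=8,
                                          batch_first=True)

    def forward(self, token_reprs, match_tags, match_pad_mask):
        # token_reprs:    (batch, seq_len, hidden) BERT token outputs
        # match_tags:     (batch, seq_len, num_matches) tag ids of the
        #                 lexicon entries matched at each token position
        # match_pad_mask: (batch, seq_len, num_matches) bool, True marks
        #                 padding slots (positions with fewer matches)
        b, t, m = match_tags.shape
        tag_reprs = self.tag_emb(match_tags)          # (b, t, m, hidden)
        # Column-wise attention: every token attends only over the column
        # of candidate lexicon matches aligned with it.
        q = token_reprs.reshape(b * t, 1, -1)
        kv = tag_reprs.reshape(b * t, m, -1)
        fused, _ = self.attn(q, kv, kv,
                             key_padding_mask=match_pad_mask.reshape(b * t, m))
        # Residual connection keeps the layer pluggable: with no useful
        # matches the output stays close to the original BERT representation.
        return token_reprs + fused.reshape(b, t, -1)

# Example: 2 sentences, 5 tokens each, up to 4 lexicon matches per token.
fusion = TagFusion()
tok = torch.randn(2, 5, 768)
tags = torch.randint(0, 16, (2, 5, 4))
pad = torch.zeros(2, 5, 4, dtype=torch.bool)  # no padding in this toy case
out = fusion(tok, tags, pad)                  # -> shape (2, 5, 768)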
Year: 2021
Venue: EMNLP
DocType: Conference
Volume: 2021.emnlp-main
Citations: 0
PageRank: 0.34
References: 0
Authors: 10
Name            Order  Citations  PageRank
Baojun Wang       1        0        0.34
Zhao Zhang        2        0        0.34
Kun Xu            3        0        0.34
Guang-Yuan Hao    4        0        0.68
Yuyang Zhang      5        4        8.85
Lifeng Shang      6      485       30.96
Linlin Li         7        0        1.01
Xiao Chen         8        0        0.34
Xin Jiang         9      150       32.43
Qun Liu          10     2149      203.11