Title
Chinese NER by Span-Level Self-Attention
Abstract
In this paper, we investigate how to improve Chinese named entity recognition (NER) by applying a self-attention mechanism to span-level semantic representations. Specifically, we propose a model that acquires character representations from pre-trained BERT, extracts features for each candidate character span with an LSTM, estimates the semantic reference value of each span, and then explicitly leverages span-level information by performing self-attention over the span representations. Experiments on the OntoNotes 4.0 dataset demonstrate that the proposed model achieves a 79.97% F1-score, outperforming our baseline methods.
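The abstract describes the pipeline only at a high level. Below is a minimal sketch (not the authors' released code) of the span-level self-attention step, assuming BiLSTM-encoded character vectors as input; the class name SpanSelfAttention, the concatenation-based span features, and the single-head scaled dot-product attention are illustrative assumptions, and the paper's estimation of each span's semantic reference value is omitted.

```python
# Hypothetical sketch of span-level self-attention for Chinese NER,
# assuming per-character encodings (e.g., BiLSTM over BERT embeddings).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpanSelfAttention(nn.Module):
    """Builds representations for all character spans up to `max_span_len`
    from per-character encodings, then lets spans attend to one another."""
    def __init__(self, char_dim, span_dim, max_span_len=4):
        super().__init__()
        self.max_span_len = max_span_len
        # Span feature: concatenation of start/end character encodings
        # (an assumption; the paper uses LSTM-extracted span features).
        self.span_proj = nn.Linear(2 * char_dim, span_dim)
        # Single-head scaled dot-product self-attention over spans.
        self.q = nn.Linear(span_dim, span_dim)
        self.k = nn.Linear(span_dim, span_dim)
        self.v = nn.Linear(span_dim, span_dim)

    def forward(self, char_enc):
        # char_enc: (seq_len, char_dim) encodings of one sentence.
        seq_len, _ = char_enc.shape
        spans = []
        for start in range(seq_len):
            for end in range(start, min(start + self.max_span_len, seq_len)):
                spans.append(torch.cat([char_enc[start], char_enc[end]]))
        spans = self.span_proj(torch.stack(spans))      # (n_spans, span_dim)
        q, k, v = self.q(spans), self.k(spans), self.v(spans)
        scores = q @ k.t() / (k.shape[-1] ** 0.5)       # (n_spans, n_spans)
        attn = F.softmax(scores, dim=-1)
        return attn @ v + spans                         # residual connection

# Usage: 20 characters encoded as 256-dim vectors.
enc = torch.randn(20, 256)
model = SpanSelfAttention(char_dim=256, span_dim=128)
out = model(enc)
print(out.shape)  # (n_spans, 128)
```

The attended span representations would then feed a classifier that labels each span with an entity type (or "no entity"), which is one common way to realize span-based NER.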
Year
2019
DOI
10.1109/CIS.2019.00023
Venue
2019 15th International Conference on Computational Intelligence and Security (CIS)
Keywords
Chinese NER, self-attention, span-level information
DocType
Conference
ISBN
978-1-7281-6093-1
Citations
0
PageRank
0.34
References
0
Authors
3
Name         Order  Citations  PageRank
Xiaoyu Dong  1      0          0.34
Xin Xin      2      77         23.24
Ping Guo     3      601        85.05