Title
Recurrent Neural Networks with Pre-trained Language Model Embedding for Slot Filling Task.
Abstract
In recent years, Recurrent Neural Network (RNN)-based models have been applied to the Slot Filling problem of Spoken Language Understanding and achieved state-of-the-art performance. In this paper, we investigate the effect of incorporating pre-trained language models into RNN-based Slot Filling models. Our evaluation on the Airline Travel Information System (ATIS) corpus shows that we can significantly reduce the amount of labeled training data and still achieve the same level of Slot Filling performance by incorporating extra word embedding and language model embedding layers pre-trained on unlabeled corpora.
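The record contains no code, but the abstract describes a concrete architecture: an RNN slot tagger whose input combines pre-trained word embeddings with contextual embeddings from a pre-trained language model. Below is a minimal, hypothetical PyTorch sketch of that general idea; the class name, dimensions, and the assumption that LM embeddings are computed externally (ELMo-style) are all illustrative, not the authors' implementation.

```python
# Hypothetical sketch: an RNN slot filler whose input concatenates a frozen
# pre-trained word embedding with a pre-computed language-model embedding,
# as the abstract describes. All names and dimensions are assumptions.
import torch
import torch.nn as nn

class SlotFillingRNN(nn.Module):
    def __init__(self, vocab_size, num_slots,
                 word_dim=100, lm_dim=1024, hidden_dim=200):
        super().__init__()
        # Word embeddings pre-trained on an unlabeled corpus (e.g. GloVe),
        # loaded elsewhere and kept frozen here.
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.word_emb.weight.requires_grad = False
        # BiLSTM over the concatenated word + LM embeddings.
        self.rnn = nn.LSTM(word_dim + lm_dim, hidden_dim,
                           batch_first=True, bidirectional=True)
        # Per-token slot classifier (one IOB slot label per word).
        self.out = nn.Linear(2 * hidden_dim, num_slots)

    def forward(self, token_ids, lm_embeddings):
        # token_ids:     (batch, seq_len) word indices
        # lm_embeddings: (batch, seq_len, lm_dim) contextual vectors from a
        #                pre-trained LM, computed outside this module
        x = torch.cat([self.word_emb(token_ids), lm_embeddings], dim=-1)
        h, _ = self.rnn(x)
        return self.out(h)  # (batch, seq_len, num_slots) slot logits

# Example forward pass with random stand-in inputs.
model = SlotFillingRNN(vocab_size=10000, num_slots=127)
logits = model(torch.randint(0, 10000, (2, 12)),
               torch.randn(2, 12, 1024))
print(logits.shape)  # torch.Size([2, 12, 127])
```

Because both embedding layers are pre-trained on unlabeled text and frozen, only the BiLSTM and output layer are fit to the labeled slot data, which is how the labeled-data requirement can shrink.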
Year: 2018
Venue: arXiv: Computation and Language
DocType: Journal
Volume: abs/1812.05199
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name          Order  Citations  PageRank
Liang Qiu     1      0          2.37
Yuanyi Ding   2      0          0.34
Lei He        3      854        67.22