Title: Using Bidirectional Transformer-CRF for Spoken Language Understanding
Abstract:
Spoken Language Understanding (SLU) is a critical component of spoken dialogue systems. It typically comprises two tasks: intent detection (ID) and slot filling (SF). Currently, the most effective models carry out these two tasks jointly and often achieve better performance than separate models. However, such models usually fail to model the interaction between intent and slots, tying the two tasks together only through a joint loss function. In this paper, we propose a new model based on a bidirectional Transformer and introduce a padding method that enables intent and slots to interact with each other effectively. A CRF layer is further added to achieve global optimization. We conduct experiments on the benchmark ATIS and Snips datasets, and the results show that our model achieves state-of-the-art performance on both tasks.
Year: 2019
DOI: 10.1007/978-3-030-32233-5_11
Venue: Lecture Notes in Artificial Intelligence
Keywords: SLU, Transformer, CRF, Joint method
DocType: Conference
Volume: 11838
ISSN: 0302-9743
Citations: 0
PageRank: 0.34
References: 0
Authors: 2
Name            Order  Citations  PageRank
Linhao Zhang    1      1          2.71
Hou-Feng Wang   2      611        53.83
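The abstract describes a Transformer encoder feeding a CRF layer so that the slot sequence is decoded globally rather than token by token. As a rough illustration of that CRF decoding step only (a minimal sketch, not the authors' code; the tag set and scores below are hypothetical), linear-chain Viterbi decoding over per-token emission scores can be written in NumPy:

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Most-likely tag sequence under a linear-chain CRF (log-space scores).

    emissions:   (seq_len, num_tags) per-token tag scores from the encoder
    transitions: (num_tags, num_tags) score of moving from tag i to tag j
    """
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()          # best score ending in each tag so far
    backpointers = []
    for t in range(1, seq_len):
        # total[i, j] = score of best path ending in tag i, then moving to tag j
        total = score[:, None] + transitions + emissions[t][None, :]
        backpointers.append(total.argmax(axis=0))
        score = total.max(axis=0)
    # follow backpointers from the best final tag
    best_tag = int(score.argmax())
    path = [best_tag]
    for bp in reversed(backpointers):
        best_tag = int(bp[best_tag])
        path.append(best_tag)
    return list(reversed(path))

# Hypothetical 3-tag example (O, B-loc, I-loc); the transition matrix
# penalizes the illegal O -> I-loc move, which per-token argmax cannot do.
emissions = np.array([[2., 0., 0.],
                      [0., 2., 0.],
                      [0., 0., 2.]])
transitions = np.zeros((3, 3))
transitions[0, 2] = -5.0
print(viterbi_decode(emissions, transitions))  # -> [0, 1, 2]
```

This is the "global optimization" the CRF layer provides: the transition scores let evidence at one position veto a locally attractive but structurally invalid tag at the next.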