Title
Label-Dependency Coding in Simple Recurrent Networks for Spoken Language Understanding
Abstract
Modeling target label dependencies is important for sequence labeling tasks. This may become crucial in Spoken Language Understanding (SLU) applications, especially for the slot-filling task, where models often have to deal with a large number of target labels. Conditional Random Fields (CRF) were previously considered the most efficient algorithm in these conditions. More recently, different architectures of Recurrent Neural Networks (RNNs) have been proposed for the SLU slot-filling task. Most of them, however, have been evaluated only on the simple ATIS database, on which it is difficult to draw significant conclusions. In this paper we propose new variants of RNNs that learn label dependencies efficiently and effectively by integrating label embeddings. We first show that modeling label dependencies is useless on the (simple) ATIS database and that unstructured models can produce state-of-the-art results on this benchmark; on ATIS our new variants match state-of-the-art models while being much simpler. On the MEDIA benchmark, in contrast, we show that the modifications introduced in the proposed RNNs outperform traditional RNNs and CRF models.
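The abstract describes RNN variants that integrate label embeddings so the network can condition on previously predicted labels. The sketch below is a minimal illustration of that general idea in PyTorch, not the authors' implementation; the class name, dimensions, and the greedy label feedback are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class LabelFeedbackRNN(nn.Module):
    """Illustrative slot-filling RNN: the embedding of the previously
    predicted label is fed back as input at each step, so the model can
    learn dependencies over its own label history. All hyper-parameters
    here are assumptions, not values from the paper."""

    def __init__(self, vocab_size, num_labels,
                 word_dim=100, label_dim=30, hidden_dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # One extra label index serves as a "beginning of sequence" label.
        self.label_emb = nn.Embedding(num_labels + 1, label_dim)
        self.cell = nn.GRUCell(word_dim + label_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_labels)
        self.bos_label = num_labels

    def forward(self, words):
        # words: LongTensor (batch, seq_len) of word indices.
        batch, seq_len = words.shape
        h = torch.zeros(batch, self.cell.hidden_size, device=words.device)
        prev_label = torch.full((batch,), self.bos_label,
                                dtype=torch.long, device=words.device)
        logits = []
        for t in range(seq_len):
            # Concatenate the current word embedding with the embedding
            # of the label predicted at the previous step.
            x = torch.cat([self.word_emb(words[:, t]),
                           self.label_emb(prev_label)], dim=-1)
            h = self.cell(x, h)
            step = self.out(h)
            logits.append(step)
            prev_label = step.argmax(dim=-1)  # greedy label feedback
        return torch.stack(logits, dim=1)  # (batch, seq_len, num_labels)
```

At training time such models are commonly trained with teacher forcing, feeding the embedding of the gold previous label instead of the predicted one; the greedy feedback shown here corresponds to decoding.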
Year
2017
DOI
10.21437/Interspeech.2017-1480
Venue
18th Annual Conference of the International Speech Communication Association (Interspeech 2017), Vols 1-6: Situated Interaction
Keywords
recurrent neural networks, label dependencies, spoken language understanding, slot filling, ATIS, MEDIA
Field
Conditional random field, Computer science, Recurrent neural network, Speech recognition, Coding (social sciences), Artificial intelligence, Spoken language, Machine learning
DocType
Conference
ISSN
2308-457X
Citations
0
PageRank
0.34
References
10
Authors
3
Name               Order  Citations  PageRank
Marco Dinarelli    1      79         11.21
Vedran Vukotic     2      29         4.59
Christian Raymond  3      118        13.80