Title
Learning Dialogue History for Spoken Language Understanding
Abstract
In task-oriented dialogue systems, spoken language understanding (SLU) aims to convert users' natural-language queries into structured representations. SLU usually consists of two subtasks, namely intent identification and slot filling. Although many methods have been proposed for SLU, they generally process each utterance in isolation, discarding the contextual information carried by the dialogue. In this paper, we propose a hierarchical LSTM-based model for SLU. The dialogue history is memorized by a turn-level LSTM and used to assist the prediction of intents and slot tags, so the understanding of the current turn depends on the preceding turns. We conduct experiments on the NLPCC 2018 Shared Task 4 dataset. The results demonstrate that dialogue history is effective for SLU and that our model outperforms all baselines.
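The hierarchy described in the abstract can be sketched structurally: a token-level recurrence encodes each utterance, and a turn-level recurrence carries a state across utterances so that turn t is understood in the context of turns 1..t. The sketch below is not the authors' model; a toy recurrent update stands in for the LSTM cells, and all function names and dimensions are illustrative assumptions.

```python
# Structural sketch (not the paper's implementation): hierarchical recurrence
# for dialogue-level SLU. A toy exponential-decay update stands in for an
# LSTM cell; what matters is the two levels of recurrence.

def cell(h, x, decay=0.5):
    """Toy recurrent update standing in for an LSTM cell."""
    return [decay * hi + (1 - decay) * xi for hi, xi in zip(h, x)]

def encode_utterance(tokens, dim=4):
    """Token-level encoder: run the cell over token vectors, return the final state."""
    h = [0.0] * dim
    for x in tokens:
        h = cell(h, x)
    return h

def encode_dialogue(dialogue, dim=4):
    """Turn-level encoder: feed each utterance encoding into a turn-level
    recurrence; the state after turn t summarizes turns 1..t (the history)."""
    s = [0.0] * dim
    history = []
    for utterance in dialogue:
        u = encode_utterance(utterance, dim)
        s = cell(s, u)           # turn-level state accumulates dialogue history
        history.append(s)
    return history               # history[t] would condition intent/slot prediction at turn t
```

In the actual model both levels would be LSTMs and the turn-level state would feed the intent classifier and the slot-tagging layer; this sketch only shows how the history state propagates across turns.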
Year
2018
DOI
10.1007/978-3-319-99495-6_11
Venue
Lecture Notes in Artificial Intelligence
Keywords
Spoken language understanding, Dialogue history, Hierarchical LSTM
Field
Computer science, Utterance, Natural language, Natural language processing, Artificial intelligence, Spoken language
DocType
Conference
Volume
11108
ISSN
0302-9743
Citations
0
PageRank
0.34
References
16
Authors
3
Name            Order  Citations  PageRank
Xiaodong Zhang  1      88         4.51
Dehong Ma       2      61         4.73
Hou-Feng Wang   3      611        53.83