Title
Intent detection using semantically enriched word embeddings
Abstract
State-of-the-art targeted language understanding systems rely on deep learning methods using 1-hot word vectors or off-the-shelf word embeddings. While word embeddings can be enriched with information from semantic lexicons (such as WordNet and PPDB) to improve their semantic representation, most previous research on word-embedding enrichment has focused on intrinsic word-level tasks such as word analogy and antonym detection. In this work, we enrich word embeddings so that semantically similar words move closer together and dissimilar words move farther apart in the embedding space, with the goal of improving an extrinsic task, namely intent detection for spoken language understanding. We utilize several semantic lexicons (WordNet, PPDB, and the Macmillan Dictionary) to enrich the word embeddings and later use them as the initial word representations for intent detection. Thus, we enrich the embeddings outside the neural network, as opposed to learning them within the network, and build a bidirectional LSTM on top of the embeddings for intent detection. Our experiments on ATIS and a real log dataset from Microsoft Cortana show that word embeddings enriched with semantic lexicons can improve intent detection.
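The enrichment step the abstract describes can be sketched as a retrofitting-style update: each word vector is iteratively pulled toward the vectors of its lexicon neighbours while staying anchored to its original value. This is a minimal illustration in the spirit of prior word-vector retrofitting work, not the paper's exact objective; the embeddings, words, lexicon edges, and weights `alpha`/`beta` below are all toy assumptions.

```python
import numpy as np

def retrofit(embeddings, lexicon, iterations=10, alpha=1.0, beta=1.0):
    # Iteratively move each word's vector toward the (current) vectors of
    # its lexicon neighbours, anchored to its original vector by alpha.
    new = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for word, neighbours in lexicon.items():
            nbrs = [n for n in neighbours if n in new]
            if not nbrs:
                continue
            total = alpha * embeddings[word] + beta * sum(new[n] for n in nbrs)
            new[word] = total / (alpha + beta * len(nbrs))
    return new

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy embeddings and a toy lexicon edge (hypothetical values, illustration only).
emb = {
    "cheap": np.array([1.0, 0.0]),
    "inexpensive": np.array([0.0, 1.0]),
    "flight": np.array([-1.0, -1.0]),
}
lexicon = {"cheap": ["inexpensive"], "inexpensive": ["cheap"]}

enriched = retrofit(emb, lexicon)
before = cosine(emb["cheap"], emb["inexpensive"])          # orthogonal before
after = cosine(enriched["cheap"], enriched["inexpensive"])  # pulled closer
```

In the paper's pipeline, vectors enriched this way would then serve as the initial embedding layer of the bidirectional LSTM intent classifier, rather than being learned jointly with the network.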
Year
2016
DOI
10.1109/SLT.2016.7846297
Venue
2016 IEEE Spoken Language Technology Workshop (SLT)
Keywords
word embeddings, semantic lexicons, LSTM, intent detection, spoken language understanding
Field
Computer science, Natural language processing, Artificial intelligence, Deep learning, Analogy, WordNet, Artificial neural network, Spoken language, Embedding, Pattern recognition, Speech recognition, Vocabulary, Semantics
DocType
Conference
ISSN
2639-5479
ISBN
978-1-5090-4904-2
Citations
5
PageRank
0.47
References
5
Authors
5
Name              Order  Citations  PageRank
Joo-kyung Kim     1      61         5.57
Gokhan Tur        2      931        83.35
Asli Çelikyilmaz  3      407        39.06
Bin Cao           4      85         12.64
Ye-Yi Wang        5      625        59.41