Title
Joint Embedding Of Words And Labels For Text Classification
Abstract
Word embeddings are effective intermediate representations for capturing semantic regularities between words when learning the representations of text sequences. We propose to view text classification as a label-word joint embedding problem: each label is embedded in the same space as the word vectors. We introduce an attention framework that measures the compatibility of embeddings between text sequences and labels. The attention is learned on a training set of labeled samples to ensure that, given a text sequence, the relevant words are weighted higher than the irrelevant ones. Our method maintains the interpretability of word embeddings and enjoys a built-in ability to leverage alternative sources of information in addition to input text sequences. Extensive results on several large text datasets show that the proposed framework outperforms the state-of-the-art methods by a large margin, in terms of both accuracy and speed.
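The abstract's core idea (labels and words sharing one embedding space, with attention derived from word-label compatibility) can be illustrated with a short sketch. The code below is not the authors' released implementation: it is a minimal PyTorch illustration, and the module name LabelWordAttention, the pooling choice, and all dimension arguments are assumptions made for clarity.

```python
# Minimal sketch (assumed PyTorch) of label-word joint embedding attention.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelWordAttention(nn.Module):
    """Hypothetical module: labels live in the word-embedding space, and
    per-word attention comes from word-label compatibility scores."""
    def __init__(self, vocab_size, num_labels, embed_dim):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.label_embed = nn.Parameter(torch.randn(num_labels, embed_dim))

    def forward(self, token_ids):                      # (batch, seq_len)
        V = self.word_embed(token_ids)                 # (batch, seq_len, dim)
        C = self.label_embed                           # (num_labels, dim)
        # Cosine compatibility between every word and every label.
        G = F.normalize(V, dim=-1) @ F.normalize(C, dim=-1).t()   # (batch, seq_len, num_labels)
        # Pool over labels, then softmax over positions: words most
        # compatible with some label receive the highest attention.
        beta = F.softmax(G.max(dim=-1).values, dim=-1)             # (batch, seq_len)
        # Attention-weighted document vector.
        z = (beta.unsqueeze(-1) * V).sum(dim=1)                    # (batch, dim)
        # Score each label by compatibility with the document vector.
        return z @ C.t()                                           # (batch, num_labels) logits
```

As a usage example, logits = LabelWordAttention(vocab_size=30000, num_labels=4, embed_dim=300)(token_ids) can be trained with a standard cross-entropy loss; the learned label vectors remain interpretable as points in the word-embedding space.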
Year
2018
DOI
10.18653/v1/p18-1216
Venue
PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1
Field
Embedding problem, Training set, Interpretability, Embedding, Computer science, Artificial intelligence, Natural language processing, Machine learning
DocType
Journal
Volume
abs/1805.04174
Citations
7
PageRank
0.41
References
25
Authors
8
Name            Order  Citations  PageRank
Guoyin Wang     1      24         7.38
Chunyuan Li     2      467        33.86
Wenlin Wang     3      51         7.06
Yizhe Zhang     4      138        19.29
Dinghan Shen    5      108        10.37
Xinyuan Zhang   6      10         3.16
Ricardo Henao   7      286        23.85
L. Carin        8      4603       339.36