Title: Learning Generic Sentence Representations Using Convolutional Neural Networks
Abstract: We propose a new encoder-decoder approach to learn distributed sentence representations that are applicable to multiple purposes. The model is learned by using a convolutional neural network as an encoder to map an input sentence into a continuous vector, and using a long short-term memory recurrent neural network as a decoder. Several tasks are considered, including sentence reconstruction and future sentence prediction. Further, a hierarchical encoder-decoder model is proposed to encode a sentence to predict multiple future sentences. By training our models on a large collection of novels, we obtain a highly generic convolutional sentence encoder that performs well in practice. Experimental results on several benchmark datasets, and across a broad range of applications, demonstrate the superiority of the proposed model over competing methods.
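The core component described in the abstract is a convolutional sentence encoder: a 1-D convolution over word embeddings followed by max-over-time pooling, which yields a fixed-length vector for a sentence of any length. The following is a minimal NumPy sketch of that idea; the dimensions, filter shapes, and function name are illustrative assumptions, not the authors' actual configuration.

```python
import numpy as np

def cnn_sentence_encoder(embeddings, filters, width=3):
    """Encode a sentence (seq_len x emb_dim word embeddings) into a
    fixed-length vector via 1-D convolution + max-over-time pooling.
    `filters` has shape (num_filters, width * emb_dim).
    All sizes here are illustrative, not the paper's configuration."""
    seq_len, emb_dim = embeddings.shape
    # Slide a window of `width` consecutive words over the sentence;
    # flattening each window and projecting it with every filter row
    # is equivalent to a 1-D convolution over the embedding sequence.
    windows = np.stack([embeddings[i:i + width].ravel()
                        for i in range(seq_len - width + 1)])  # (n_win, width*emb_dim)
    feature_maps = np.tanh(windows @ filters.T)                # (n_win, num_filters)
    # Max-over-time pooling: one value per filter, so the output size
    # does not depend on sentence length.
    return feature_maps.max(axis=0)                            # (num_filters,)

rng = np.random.default_rng(0)
emb_dim, num_filters = 8, 16
sentence = rng.normal(size=(10, emb_dim))          # 10 hypothetical word embeddings
filters = rng.normal(size=(num_filters, 3 * emb_dim))
z = cnn_sentence_encoder(sentence, filters)
print(z.shape)  # (16,) -- fixed-size sentence code regardless of length
```

In the paper this fixed-length vector is then fed to an LSTM decoder that reconstructs the input sentence or predicts the following one(s); the sketch above covers only the encoding step.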
Year: 2017
DOI: 10.18653/v1/D17-1254
Venue: Empirical Methods in Natural Language Processing (EMNLP)
Field: ENCODE, Computer science, Convolutional neural network, Recurrent neural network, Encoder, Natural language processing, Artificial intelligence, Deep learning, Sentence, Machine learning
DocType: Conference
Volume: D17-1
Citations: 15
PageRank: 0.65
References: 42
Authors: 6

Name             Order   Citations/PageRank
Zhe Gan          1       31932.58
Yunchen Pu       2       888.55
Ricardo Henao    3       28623.85
Chunyuan Li      4       46733.86
Xiaodong He      5       3858190.28
Lawrence Carin   6       13711.38