Abstract
---
We propose a new encoder-decoder approach to learn distributed sentence representations that are applicable to multiple purposes. The model is learned by using a convolutional neural network as an encoder to map an input sentence into a continuous vector, and a long short-term memory recurrent neural network as a decoder. Several tasks are considered, including sentence reconstruction and future sentence prediction. Further, a hierarchical encoder-decoder model is proposed that encodes a sentence to predict multiple future sentences. By training our models on a large collection of novels, we obtain a highly generic convolutional sentence encoder that performs well in practice. Experimental results on several benchmark datasets, and across a broad range of applications, demonstrate the superiority of the proposed model over competing methods.
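The abstract's core architecture, a convolutional sentence encoder feeding a long short-term memory decoder, can be sketched as below. This is a minimal illustration, not the authors' code: the hyperparameters, layer choices, and PyTorch usage are assumptions for the sake of a runnable example.

```python
import torch
import torch.nn as nn

class ConvEncoder(nn.Module):
    """CNN sentence encoder: embeds tokens, applies a 1-D convolution,
    and max-pools over time to get a fixed-size sentence vector.
    (Sizes here are illustrative, not the paper's settings.)"""
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, hidden_dim, kernel_size=3, padding=1)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)  # (batch, embed_dim, seq_len)
        h = torch.relu(self.conv(x))            # (batch, hidden_dim, seq_len)
        return h.max(dim=2).values              # max-over-time pooling

class LSTMDecoder(nn.Module):
    """LSTM decoder conditioned on the sentence vector: the encoding
    initializes the hidden state; outputs logits over the vocabulary.
    Used for reconstruction or future-sentence prediction targets."""
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, sentence_vec, tokens):
        h0 = sentence_vec.unsqueeze(0)          # (1, batch, hidden_dim)
        c0 = torch.zeros_like(h0)
        y, _ = self.lstm(self.embed(tokens), (h0, c0))
        return self.out(y)                      # (batch, seq_len, vocab)

enc, dec = ConvEncoder(), LSTMDecoder()
src = torch.randint(0, 1000, (2, 7))            # toy batch of 2 sentences
z = enc(src)                                    # (2, 128) sentence vectors
logits = dec(z, src)                            # (2, 7, 1000) reconstruction logits
```

Training would minimize cross-entropy between `logits` and the target sentence; the hierarchical variant described in the abstract would attach several such decoders (or a sentence-level recurrence) to predict multiple future sentences from one encoding.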
Year | DOI | Venue
---|---|---
2017 | 10.18653/v1/D17-1254 | Empirical Methods in Natural Language Processing
Field | DocType | Volume
---|---|---
ENCODE, Computer science, Convolutional neural network, Recurrent neural network, Encoder, Natural language processing, Artificial intelligence, Deep learning, Sentence, Machine learning | Conference | D17-1
Citations | PageRank | References
---|---|---
15 | 0.65 | 42
Authors
---
6
Name | Order | Citations | PageRank |
---|---|---|---
Zhe Gan | 1 | 319 | 32.58 |
Yunchen Pu | 2 | 88 | 8.55 |
Ricardo Henao | 3 | 286 | 23.85 |
Chunyuan Li | 4 | 467 | 33.86 |
Xiaodong He | 5 | 3858 | 190.28 |
Lawrence Carin | 6 | 137 | 11.38 |