Title
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
Abstract
In this paper, we propose a novel neural network model called RNN Encoder-Decoder that consists of two recurrent neural networks (RNN). One RNN encodes a sequence of symbols into a fixed-length vector representation, and the other decodes the representation into another sequence of symbols. The encoder and decoder of the proposed model are jointly trained to maximize the conditional probability of a target sequence given a source sequence. The performance of a statistical machine translation system is empirically found to improve by using the conditional probabilities of phrase pairs computed by the RNN Encoder-Decoder as an additional feature in the existing log-linear model. Qualitatively, we show that the proposed model learns a semantically and syntactically meaningful representation of linguistic phrases.
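The abstract describes the architecture at a high level: an encoder RNN compresses the source sequence into a fixed-length vector, and a decoder RNN generates the target sequence from that vector, with both networks trained jointly to maximize the conditional probability of the target given the source. Below is a minimal, self-contained sketch of that idea in PyTorch. It is not the authors' implementation; the use of nn.GRU, the layer sizes, vocabulary sizes, and the teacher-forcing training step are illustrative assumptions.

import torch
import torch.nn as nn

class RNNEncoderDecoder(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        # Encoder RNN: reads the source sequence and summarizes it
        # into a fixed-length hidden state.
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Decoder RNN: generates the target sequence conditioned on
        # the encoder's final hidden state.
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src, tgt_in):
        # src: (batch, src_len) source token ids
        # tgt_in: (batch, tgt_len) target token ids, shifted right (teacher forcing)
        _, summary = self.encoder(self.src_emb(src))           # fixed-length summary vector
        dec_out, _ = self.decoder(self.tgt_emb(tgt_in), summary)
        return self.out(dec_out)                               # logits over the target vocabulary

# Joint training maximizes the conditional log-likelihood of the target
# sequence given the source, i.e. minimizes next-token cross-entropy.
model = RNNEncoderDecoder(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (2, 7))       # toy batch of source phrases
tgt_in = torch.randint(0, 1000, (2, 5))
tgt_out = torch.randint(0, 1000, (2, 5))
logits = model(src, tgt_in)
loss = nn.functional.cross_entropy(logits.reshape(-1, 1000), tgt_out.reshape(-1))
loss.backward()

In the paper's SMT setting, a trained model of this kind scores source-target phrase pairs, and those scores are added as an extra feature in the existing log-linear translation model.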
Year: 2014
Venue: EMNLP
DocType: Journal
Volume: abs/1406.1078
Citations: 1305
PageRank: 52.71
References: 25
Authors: 7
Name                    Order   Citations   PageRank
Kyunghyun Cho           1       6803        316.85
Bart van Merriënboer    2       2007        85.22
Çaglar Gülçehre         3       3010        133.22
Dzmitry Bahdanau        4       2677        117.03
Fethi Bougares          5       1386        67.48
Holger Schwenk          6       2533        228.83
Yoshua Bengio           7       42677       3039.83