Title
Modeling Recurrence for Transformer.
Abstract
Recently, the Transformer model, which is based solely on attention mechanisms, has advanced the state of the art on various machine translation tasks. However, recent studies reveal that the lack of recurrence hinders further improvement of its translation capacity. In response to this problem, we propose to directly model recurrence for Transformer with an additional recurrence encoder. In addition to the standard recurrent neural network, we introduce a novel attentive recurrent network to leverage the strengths of both attention and recurrent networks. Experimental results on the widely used WMT14 English-German and WMT17 Chinese-English translation tasks demonstrate the effectiveness of the proposed approach. Our studies also reveal that the proposed model benefits from a short-cut that bridges the source and target sequences with a single recurrent layer, which outperforms its deep counterpart.
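The attentive recurrent network mentioned in the abstract can be pictured as a recurrent layer whose input at each step is an attention-weighted summary of the source representations, so recurrence tracks order while attention gathers content. The code below is a minimal sketch of that idea in PyTorch, not the authors' implementation; the class name AttentiveRecurrentNetwork, the choice of a GRU cell, the mean-pooled initial state, the number of recurrent steps, and all dimensions are assumptions made for illustration, and the paper's actual formulation may differ.

# Minimal sketch (not the paper's code): an "attentive recurrent network"
# that combines attention with recurrence. At each step, the GRU state
# queries the source representations via scaled dot-product attention, and
# the resulting context vector is fed to the GRU cell as its input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveRecurrentNetwork(nn.Module):  # hypothetical name and design
    def __init__(self, d_model: int, num_steps: int):
        super().__init__()
        self.num_steps = num_steps            # length of the recurrent summary
        self.cell = nn.GRUCell(d_model, d_model)
        self.query = nn.Linear(d_model, d_model)

    def forward(self, memory: torch.Tensor) -> torch.Tensor:
        # memory: (batch, src_len, d_model), e.g. Transformer encoder output
        batch, _, d_model = memory.shape
        state = memory.mean(dim=1)            # simple initial state (assumption)
        outputs = []
        for _ in range(self.num_steps):
            # attention of the current state over all source positions
            scores = torch.bmm(memory, self.query(state).unsqueeze(-1)).squeeze(-1)
            weights = F.softmax(scores / d_model ** 0.5, dim=-1)
            context = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)
            state = self.cell(context, state)  # recurrence over attention outputs
            outputs.append(state)
        # (batch, num_steps, d_model): a recurrent summary that a decoder could
        # attend to in addition to the standard encoder output.
        return torch.stack(outputs, dim=1)

# Usage example with toy shapes.
if __name__ == "__main__":
    enc_out = torch.randn(2, 7, 512)                    # fake encoder states
    arn = AttentiveRecurrentNetwork(d_model=512, num_steps=4)
    print(arn(enc_out).shape)                           # torch.Size([2, 4, 512])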
Year
2019
DOI
10.18653/v1/n19-1122
Venue
North American Chapter of the Association for Computational Linguistics
Field
Computer science, Machine translation, Transformer, Recurrent neural network, Artificial intelligence, Encoder, Machine learning
DocType
Journal
Volume
abs/1904.03092
Citations
2
PageRank
0.37
References
0
Authors
6
Name            Order  Citations  PageRank
Jie Hao         1      175        38.33
Xing Wang       2      58         10.07
Baosong Yang    3      2          3.75
Longyue Wang    4      72         18.24
Jinfeng Zhang   5      2          1.05
Zhaopeng Tu     6      518        39.95