Title
Regularized Context Gates on Transformer for Machine Translation
Abstract
Context gates are effective in controlling the contributions from the source and target contexts in recurrent neural network (RNN) based neural machine translation (NMT). However, it is challenging to extend them to the Transformer architecture, which is more complicated than the RNN. This paper first provides a method to identify the source and target contexts in Transformer and then introduces a gate mechanism to control their contributions. In addition, to further reduce the bias problem in the gate mechanism, this paper proposes a regularization method that guides the learning of the gates with supervision automatically generated using pointwise mutual information. Extensive experiments on four translation datasets demonstrate that the proposed model obtains an average gain of 1.0 BLEU point over a strong Transformer baseline.
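For readers unfamiliar with the mechanism the abstract refers to, the sketch below illustrates the gating arithmetic of a context gate in the standard formulation introduced for RNN NMT by Tu et al. (2017): a sigmoid gate interpolates per dimension between a source context vector and a target context vector, and a regularization term ties the gate to a PMI-derived supervision signal. This is a minimal sketch, not the paper's implementation; the class name ContextGate, the helper gate_regularizer, the mean-squared loss, and the pmi_target tensor are illustrative assumptions, and the paper's exact identification of the two contexts inside Transformer layers is not reproduced here.

```python
import torch
import torch.nn as nn


class ContextGate(nn.Module):
    """Sigmoid gate interpolating between source and target context vectors
    (standard context-gate formulation; names are illustrative)."""

    def __init__(self, d_model: int):
        super().__init__()
        # The gate is computed from the concatenation of both contexts.
        self.gate_proj = nn.Linear(2 * d_model, d_model)

    def forward(self, source_ctx: torch.Tensor, target_ctx: torch.Tensor):
        # z in (0, 1)^d controls the per-dimension mix of the two contexts.
        z = torch.sigmoid(self.gate_proj(torch.cat([source_ctx, target_ctx], dim=-1)))
        fused = z * source_ctx + (1.0 - z) * target_ctx
        return fused, z


def gate_regularizer(z: torch.Tensor, pmi_target: torch.Tensor) -> torch.Tensor:
    # Hypothetical regularizer: push the mean gate activation toward a
    # supervision signal in [0, 1] derived from pointwise mutual information,
    # so the gate is not biased toward one context. The paper's exact loss
    # may differ.
    return ((z.mean(dim=-1) - pmi_target) ** 2).mean()


# Usage sketch with dummy tensors of shape (batch, target_len, d_model).
gate = ContextGate(d_model=512)
src = torch.randn(8, 10, 512)   # source context per target position
tgt = torch.randn(8, 10, 512)   # target context per target position
fused, z = gate(src, tgt)
pmi_target = torch.rand(8, 10)  # hypothetical PMI-derived supervision
loss = gate_regularizer(z, pmi_target)
```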
Year
2020
Venue
ACL
DocType
Conference
Volume
2020.acl-main
Citations
0
PageRank
0.34
References
0
Authors
5
Name           Order  Citations  PageRank
Xintong Li     1      2          1.41
Lemao Liu      2      87         18.74
Rui Wang       3      76         18.98
Guoping Huang  4      3          2.08
Max Meng       5      0          1.35