Title
Neural Machine Translation With Sentence-Level Topic Context
Abstract
Traditional neural machine translation (NMT) methods use only word-level context to predict the target-language translation and neglect sentence-level context, which has been shown to be beneficial for translation prediction in statistical machine translation. This paper represents sentence-level context as latent topic representations obtained with a convolutional neural network, and designs a topic attention mechanism to integrate source sentence-level topic context into both attention-based and Transformer-based NMT. In particular, the proposed method improves NMT by modeling source topics and translations jointly. Experiments on large-scale LDC Chinese-to-English and WMT’14 English-to-German translation tasks show that the proposed approach achieves significant improvements over baseline systems.
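To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch of one plausible reading of it: a CNN pools source word embeddings into a fixed set of latent topic vectors, and a separate topic attention lets the decoder state attend over those vectors to obtain a sentence-level topic context. All names (TopicEncoder, TopicAttention, num_topics, topic_dim) and the specific convolution and pooling choices are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of sentence-level topic extraction and topic attention.
# Class names, hyperparameters, and pooling choices are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopicEncoder(nn.Module):
    """Derives K latent topic vectors from source word embeddings with a 1-D CNN."""

    def __init__(self, emb_dim: int, topic_dim: int, num_topics: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, topic_dim * num_topics, kernel_size,
                              padding=kernel_size // 2)
        self.num_topics = num_topics
        self.topic_dim = topic_dim

    def forward(self, src_emb: torch.Tensor) -> torch.Tensor:
        # src_emb: (batch, src_len, emb_dim) -> (batch, emb_dim, src_len) for Conv1d
        h = torch.tanh(self.conv(src_emb.transpose(1, 2)))        # (batch, K*topic_dim, src_len)
        h = F.max_pool1d(h, kernel_size=h.size(-1)).squeeze(-1)   # max-pool over source positions
        return h.view(-1, self.num_topics, self.topic_dim)        # (batch, K, topic_dim)


class TopicAttention(nn.Module):
    """Attends over the K topic vectors given the current decoder state."""

    def __init__(self, dec_dim: int, topic_dim: int):
        super().__init__()
        self.query = nn.Linear(dec_dim, topic_dim, bias=False)

    def forward(self, dec_state: torch.Tensor, topics: torch.Tensor) -> torch.Tensor:
        # dec_state: (batch, dec_dim), topics: (batch, K, topic_dim)
        q = self.query(dec_state).unsqueeze(1)                    # (batch, 1, topic_dim)
        scores = torch.bmm(q, topics.transpose(1, 2))             # (batch, 1, K)
        weights = torch.softmax(scores, dim=-1)                   # attention over topics
        return torch.bmm(weights, topics).squeeze(1)              # (batch, topic_dim)


if __name__ == "__main__":
    batch, src_len, emb_dim, dec_dim = 2, 7, 32, 64
    enc = TopicEncoder(emb_dim, topic_dim=16, num_topics=4)
    att = TopicAttention(dec_dim, topic_dim=16)
    topics = enc(torch.randn(batch, src_len, emb_dim))
    topic_ctx = att(torch.randn(batch, dec_dim), topics)
    print(topic_ctx.shape)                                        # torch.Size([2, 16])

In an attention-based or Transformer decoder, the returned topic context vector would be combined (e.g., concatenated or gated) with the usual word-level context before the output projection; the paper's exact fusion strategy is not reproduced here.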
Year
2019
DOI
10.1109/TASLP.2019.2937190
Venue
IEEE/ACM Transactions on Audio, Speech and Language Processing (TASLP)
Keywords
Task analysis, Speech processing, Decoding, Computer architecture, Convolution, Feature extraction, Context modeling
Field
Language translation, Convolutional neural network, Computer science, Machine translation, Speech recognition, Sentence
DocType
Journal
Volume
27
Issue
12
ISSN
2329-9290
Citations
4
PageRank
0.40
References
14
Authors
5
Name             Order  Citations  PageRank
Kehai Chen       1      43         16.34
Rui Wang         2      19         2.68
Masao Utiyama    3      714        86.69
Eiichiro SUMITA  4      1466       190.87
Tiejun Zhao      5      643        102.68