Title
Temporal Attention Model for Neural Machine Translation
Abstract
Attention-based Neural Machine Translation (NMT) models suffer from attention deficiency issues, as observed in recent research. We propose a novel mechanism that addresses some of these limitations and improves NMT attention. Specifically, our approach memorizes the alignments temporally (within each sentence) and modulates the attention with the accumulated temporal memory as the decoder generates the candidate translation. We compare our approach against the baseline NMT model and two other related approaches that address this issue either explicitly or implicitly. Large-scale experiments on two language pairs show that our approach achieves larger and more robust gains than the baseline and related NMT approaches. Our model further outperforms strong SMT baselines in some settings, even without using ensembles.
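The abstract describes the mechanism only at a high level: alignments are memorized temporally within each sentence, and the accumulated memory modulates subsequent attention. Below is a minimal NumPy sketch of one plausible formulation, in which the exponentiated alignment scores at each decoder step are divided by a running sum of the exponentiated scores from earlier steps, so source positions attended to before are down-weighted. The function name temporal_attention_step, the all-ones initialization of the memory, and the exact normalization are illustrative assumptions, not the paper's published equations.

```python
import numpy as np

def temporal_attention_step(scores, memory):
    """One decoder step of a temporal-attention sketch (illustrative only).

    scores : (src_len,) raw alignment scores for the current target step
    memory : (src_len,) running sum of exponentiated scores from past steps
             (initialized to ones, so the first step is a plain softmax)

    Returns the modulated attention weights and the updated memory.
    """
    exp_scores = np.exp(scores)            # stability tricks omitted for clarity
    modulated = exp_scores / memory        # penalize previously attended positions
    weights = modulated / modulated.sum()  # renormalize to a distribution
    return weights, memory + exp_scores    # accumulate the temporal memory

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src_len = 5
    memory = np.ones(src_len)
    for t in range(3):                     # a few decoder steps on random scores
        scores = rng.normal(size=src_len)
        weights, memory = temporal_attention_step(scores, memory)
        print(f"step {t}: {np.round(weights, 3)}")
```

In this sketch the memory plays a role similar to a coverage vector: repeatedly attending to the same source position shrinks its modulated score, matching the abstract's claim that attention is modulated by the accumulated temporal memory.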
Year: 2016
Venue: arXiv: Computation and Language
Field: Computer science, Machine translation, Attention model, Baseline (configuration management), Speech recognition, Artificial intelligence, Natural language processing, Sentence, Machine learning
DocType: Journal
Volume: abs/1608.02927
Citations: 9
PageRank: 0.74
References: 15
Authors: 4
Name                Order  Citations  PageRank
Baskaran Sankaran   1      155        13.65
Haitao Mi           2      479        23.40
Yaser Al-Onaizan    3      540        38.51
Abe Ittycheriah     4      318        22.92