Title
Bi-Directional Differentiable Input Reconstruction for Low-Resource Neural Machine Translation.
Abstract
We aim to better exploit the limited amounts of parallel text available in low-resource settings by introducing a differentiable reconstruction loss for neural machine translation (NMT). We reconstruct the input from sampled translations and leverage differentiable sampling and bi-directional NMT to build a compact model that can be trained end-to-end. This approach achieves small but consistent BLEU improvements on four language pairs in both translation directions, and outperforms an alternative differentiable reconstruction strategy based on hidden states.
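The abstract describes making translation sampling differentiable so that a reconstruction loss on the input can be trained end-to-end. The following is a minimal sketch of that idea, not the authors' implementation: it assumes PyTorch, a straight-through Gumbel-softmax sampler, and a toy shared translator (TinyTranslator, VOCAB, DIM are illustrative stand-ins, not the paper's architecture). The sampled translation is embedded softly, translated back, and scored against the original source to give the reconstruction loss.

import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM = 100, 32

class TinyTranslator(nn.Module):
    # Toy stand-in for one direction of a bi-directional NMT model.
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.rnn = nn.GRU(DIM, DIM, batch_first=True)
        self.out = nn.Linear(DIM, VOCAB)

    def forward_embedded(self, emb):
        # Accepts (soft) embeddings directly so sampled translations stay differentiable.
        h, _ = self.rnn(emb)
        return self.out(h)  # (batch, length, VOCAB) logits

    def forward(self, tokens):
        return self.forward_embedded(self.embed(tokens))

model = TinyTranslator()               # one shared model, as in bi-directional NMT
src = torch.randint(0, VOCAB, (4, 7))  # a toy batch of source sentences

# 1) Translate src -> tgt (non-autoregressively here, for brevity).
tgt_logits = model(src)

# 2) Differentiable sampling: straight-through Gumbel-softmax yields hard
#    one-hot samples in the forward pass and soft gradients in the backward pass.
samples = F.gumbel_softmax(tgt_logits, tau=1.0, hard=True)   # (4, 7, VOCAB)

# 3) Reconstruct: embed the sampled translation as a convex combination of
#    embeddings, translate it back, and score it against the original source.
sampled_emb = samples @ model.embed.weight                    # (4, 7, DIM)
rec_logits = model.forward_embedded(sampled_emb)
rec_loss = F.cross_entropy(rec_logits.reshape(-1, VOCAB), src.reshape(-1))

rec_loss.backward()   # gradients flow through the sampled translation

In training, this reconstruction loss would be added to the usual translation cross-entropy for both directions of the bi-directional model.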
Year
2018
DOI
10.18653/v1/n19-1043
Venue
North American Chapter of the Association for Computational Linguistics
Field
BLEU, Computer science, Machine translation, Exploit, Differentiable function, Artificial intelligence, Sampling (statistics), Machine learning
DocType
Journal
Volume
abs/1811.01116
Citations
0
PageRank
0.34
References
0
Authors
3
Name	Order	Citations	PageRank
Xing Niu	1	135	10.15
Weijia Xu	2	0	5.75
Marine Carpuat	3	587	51.99