Title
Code-Switching for Enhancing NMT with Pre-Specified Translation
Abstract
Leveraging user-provided translations to constrain NMT has practical significance. Existing methods fall into two main categories: replacing lexicon words with placeholder tags, and imposing hard constraints during decoding. Both can hurt translation fidelity for various reasons. We investigate a data augmentation method that builds code-switched training data by replacing source phrases with their target translations. Our method requires no change to the NMT model or decoding algorithm, and allows the model to learn lexicon translations by copying source-side target words. Extensive experiments show that our method achieves consistent improvements over existing approaches, improving translation of constrained words without hurting unconstrained words.
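The augmentation described in the abstract can be illustrated with a short sketch. The Python below is not the authors' released code; the function name, phrase-length limit, replacement probability, and toy lexicon are all illustrative assumptions. It shows the general recipe: scan the source sentence for phrases covered by a bilingual lexicon and splice in their target translations, leaving the target side of the training pair unchanged.

import random

def code_switch(source_tokens, lexicon, replace_prob=0.5, max_phrase_len=3):
    """Replace lexicon phrases in `source_tokens` with their target
    translations, each with probability `replace_prob` (a sketch, not
    the paper's exact procedure)."""
    out = []
    i = 0
    while i < len(source_tokens):
        replaced = False
        # Try longer phrases first so "rotes Auto" wins over "Auto" alone.
        for n in range(max_phrase_len, 0, -1):
            phrase = " ".join(source_tokens[i:i + n])
            if phrase in lexicon and random.random() < replace_prob:
                out.extend(lexicon[phrase].split())  # splice in target words
                i += n
                replaced = True
                break
        if not replaced:
            out.append(source_tokens[i])
            i += 1
    return out

# Toy German -> English example with a hypothetical lexicon.
lexicon = {"Haus": "house", "rotes Auto": "red car"}
src = "das rotes Auto steht vor dem Haus".split()
print(" ".join(code_switch(src, lexicon, replace_prob=1.0)))
# Prints: das red car steht vor dem house

At training time, the target sentence of each augmented pair stays the original reference, so copying the in-lined target words from the source becomes the behavior the model learns.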
Year
2019
Venue
North American Chapter of the Association for Computational Linguistics
Field
Training set, Fidelity, Computer science, Code-switching, Copying, Lexicon, Natural language processing, Artificial intelligence, Decoding methods
DocType
Journal
Volume
abs/1904.09107
Citations
0
PageRank
0.34
References
0
Authors
6
Name        Order  Citations  PageRank
Kai Song    1      4          4.13
Yue Zhang   2      1364       114.17
Heng Yu     3      32         6.67
Weihua Luo  4      9          10.38
Kun Wang    5      150        45.41
Min Zhang   6      1849       157.00