Title
Improving low-resource machine transliteration by using 3-way transfer learning
Abstract
Transfer learning improves machine translation by pretraining a model on a resource-rich language pair and then adapting it to the desired language pair. However, to date, there have been few attempts to tackle machine transliteration with transfer learning. In this article, we propose a method that uses source-pivot and pivot-target datasets to improve source-target machine transliteration. Our approach first bridges the source-pivot and pivot-target datasets by reducing the distance between source and pivot embeddings. Then, our model learns to translate from the pivot language to the target language. Finally, the source-target dataset is used to fine-tune the model. Our experiments show that our method outperforms standard transfer learning. When implemented with a state-of-the-art source-target translation model from NEWS'18, our transfer learning method improves accuracy by 1.1%.
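The abstract describes a three-stage schedule: align source and pivot embeddings, train on pivot-target pairs, then fine-tune on source-target pairs. The sketch below only illustrates that stage ordering; the class, method names, and the Korean/English/Japanese example pairs are hypothetical placeholders, not the paper's actual model or data.

```python
# Hedged sketch of the 3-way transfer learning schedule from the abstract.
# All names and data here are illustrative assumptions, not from the paper.

class TransliterationModel:
    """Toy stand-in for a seq2seq transliteration model."""

    def __init__(self):
        self.stages = []  # records the order in which datasets are used

    def align_embeddings(self, src_pivot_pairs):
        # Stage 1: reduce the distance between source and pivot
        # character embeddings using source-pivot pairs.
        self.stages.append("align(source-pivot)")

    def train(self, pairs, label):
        # Stages 2-3: supervised training on transliteration pairs.
        self.stages.append(f"train({label})")


model = TransliterationModel()
model.align_embeddings([("서울", "Seoul")])                 # hypothetical source-pivot data
model.train([("Seoul", "ソウル")], label="pivot-target")     # hypothetical pivot-target data
model.train([("서울", "ソウル")], label="source-target")     # fine-tuning on source-target data
print(model.stages)
```

The point of the schedule is that the low-resource source-target data is touched only in the final fine-tuning stage, after the model has already learned pivot-to-target transliteration on richer data.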
Year: 2022
DOI: 10.1016/j.csl.2021.101283
Venue: COMPUTER SPEECH AND LANGUAGE
Keywords: Machine transliteration, Transfer learning, Low resource
DocType: Journal
Volume: 72
ISSN: 0885-2308
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name                     Order  Citations  PageRank
Chun-Kai Wu              1      0          0.68
Chao-Chuang Shih         2      0          0.34
Yu-Chun Wang             3      0          0.68
Richard Tzong-Han Tsai   4      7          22.84