Title
Logographic Subword Model for Neural Machine Translation
Abstract
A novel logographic subword model is proposed that reinterprets logograms as abstract subwords for neural machine translation. Our approach drastically reduces the size of the artificial neural network while maintaining BLEU scores comparable to those attained with the baseline RNN and CNN seq2seq models. The smaller model size also leads to shorter training and inference time. Experiments on English-Chinese and Chinese-English translation demonstrate that the reductions in these measures range from 11% to as high as 77%. Unlike previous subword models, abstract subwords can be applied to a variety of logographic languages. Since most logographic languages are ancient and very low-resource, these advantages are desirable for archaeological computational-linguistics applications such as a resource-limited, offline, hand-held Demotic-English translator.
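The size reduction claimed above is plausible because in seq2seq models the input embedding and output softmax layers scale linearly with vocabulary size, so mapping logograms to a much smaller abstract-subword vocabulary shrinks those layers directly. A minimal sketch of this arithmetic, using hypothetical vocabulary sizes and embedding dimensions (not figures from the paper):

```python
# Hedged illustration (hypothetical numbers, not the paper's): the input
# embedding and the output softmax projection each hold vocab_size * embed_dim
# weights, so shrinking the vocabulary shrinks both layers proportionally.
def seq2seq_vocab_params(vocab_size: int, embed_dim: int) -> int:
    """Parameter count of the input embedding plus an untied output projection."""
    return 2 * vocab_size * embed_dim

# Assumed sizes: a word-level Chinese vocabulary vs. a smaller
# abstract-subword vocabulary, both with 512-dimensional embeddings.
word_level = seq2seq_vocab_params(50_000, 512)     # 51,200,000 parameters
subword_level = seq2seq_vocab_params(8_000, 512)   # 8,192,000 parameters
reduction = 1 - subword_level / word_level
print(f"{reduction:.0%} fewer embedding/softmax parameters")  # 84% here
```

Fewer parameters in these layers also mean fewer multiply-accumulates per decoding step, which is consistent with the shorter training and inference times reported.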
Year: 2018
Venue: arXiv: Computation and Language
Field: Inference, Computer science, Machine translation, Natural language processing, Artificial intelligence, Artificial neural network
DocType:
Journal:
Volume: abs/1809.02592
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name          Order  Citations  PageRank
Yihao Fang    1      0          0.68
Rong Zheng    2      24         5.58
Xiaodan Zhu   3      1387       73.09