Title
Divergence-based fine pruning of phrase-based statistical translation model.
Abstract
Entropy-based pruning is limited in selecting a fine-grained distribution of phrase pairs to prune at a given threshold. In our preliminary empirical analysis, changing this distribution through other divergence metrics improves pruning efficiency. The problematic factors we derive are a fixed divergence distribution and the missing impact of word-coupling strength. We propose a fine pruning method that uses two parameters to control these factors and analyze their effects on the divergence change. It improves pruning efficiency compared with entropy-based pruning in practical translation among English, Spanish, and French.
Phrase-based statistical machine translation, a widely used automatic translation approach, learns a probabilistic translation model composed of phrases from a large parallel corpus, together with a large language model. The translation model is often enormous because of the many combinations of source and target phrases, which restricts its application in limited computing environments. Entropy-based pruning resolves this issue by reducing the model size while retaining translation quality. To reduce the size safely, the method detects redundant components by evaluating the relative entropy between the models before and after pruning those components. This method is effective in the literature, but we have observed that it can be improved further by adjusting the divergence distribution determined by the relative entropy. From preliminary experiments, we derive two factors responsible for limiting the pruning efficiency of entropy-based pruning. The first is the proportion of pairs composing translation models with respect to their translation probability and its estimate. The second is the exponential increase of the divergence for pairs with low translation probability and estimate. To control these factors, we propose divergence-based fine pruning, which uses a divergence metric to adapt the curvature change of the pruning boundary conditions, together with Laplace smoothing.
In practical translation tasks for the English-Spanish and English-French language pairs, this method shows a statistically significant improvement in efficiency, pruning up to 50% and on average 12% more than entropy-based pruning at the same translation quality.
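The abstract describes scoring each phrase pair by its contribution to the relative entropy between the model before and after pruning, with Laplace (additive) smoothing of the probability estimates. The sketch below illustrates that idea in minimal form; the function names, the dictionary-based phrase table, and the precomputed estimates are illustrative assumptions, not the authors' implementation.

```python
import math

def laplace_smooth(count, total, vocab_size, alpha=1.0):
    """Additive (Laplace) smoothing of a probability estimate,
    so zero-count pairs still receive nonzero probability."""
    return (count + alpha) / (total + alpha * vocab_size)

def pruning_score(p_model, p_estimate):
    """One pair's contribution to the relative entropy
    D(p || p') = sum_i p_i * log(p_i / p'_i).
    A small score means the pruned model approximates the pair well."""
    return p_model * math.log(p_model / p_estimate)

def prune(phrase_table, estimates, threshold):
    """Keep only pairs whose divergence contribution exceeds the
    threshold; the rest are treated as redundant and dropped."""
    return {pair: p for pair, p in phrase_table.items()
            if pruning_score(p, estimates[pair]) > threshold}

# Toy usage: one pair is poorly approximated by its estimate (kept),
# one is matched exactly by its estimate (pruned at threshold 0).
table = {("a", "b"): 0.5, ("c", "d"): 0.1}
estimates = {("a", "b"): 0.05, ("c", "d"): 0.1}
kept = prune(table, estimates, 0.0)
```

The divergence-based fine pruning the paper proposes would replace this fixed relative-entropy score with a parameterized divergence metric; this sketch only shows the entropy-based baseline being improved upon.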
Year
2017
DOI
10.1016/j.csl.2016.06.006
Venue
Computer Speech & Language
Keywords
Statistical machine translation, Model revision, Phrase table, Entropy-based pruning, Relative entropy
Field
Killer heuristic, Machine translation, Artificial intelligence, Language model, Pruning, Pattern recognition, Principal variation search, Speech recognition, Pruning (decision trees), Machine learning, Kullback–Leibler divergence, Mathematics, Additive smoothing
DocType
Journal
Volume
41
Issue
C
ISSN
0885-2308
Citations
0
PageRank
0.34
References
17
Authors
5
Name            Order   Citations   PageRank
Kangil Kim      1       24          7.24
Eun-Jin Park    2       0           0.34
Jong-Hun Shin   3       0           1.01
Oh-Woog Kwon    4       100         6.61
Young Kil Kim   5       7           1.94