Title
Efficient Decoding for Statistical Machine Translation with a Fully Expanded WFST Model
Abstract
This paper proposes a novel method to compile statistical models for machine translation to achieve efficient decoding. In our method, each statistical submodel is represented by a weighted finite-state transducer (WFST), and all of the submodels are expanded into a composition model beforehand. Furthermore, the ambiguity of the composition model is reduced by the statistics of hypotheses while decoding. The experimental results show that the proposed model representation drastically improves the efficiency of decoding compared to the dynamic composition of the submodels, which corresponds to conventional approaches.
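The abstract describes composing the statistical submodels (e.g., a translation model and a language model) into a single WFST offline, before decoding, rather than composing them on the fly. The following is a minimal, self-contained Python sketch of tropical-semiring WFST composition that illustrates this "fully expanded" idea; it is not the authors' implementation, and the class, labels, and weights are illustrative assumptions.

```python
from collections import defaultdict

class WFST:
    """Weighted finite-state transducer over the tropical (min, +) semiring.

    arcs[state] holds (input_label, output_label, weight, next_state) tuples.
    """
    def __init__(self, start, finals):
        self.start = start
        self.finals = dict(finals)          # state -> final weight
        self.arcs = defaultdict(list)

    def add_arc(self, src, ilabel, olabel, weight, dst):
        self.arcs[src].append((ilabel, olabel, weight, dst))

def compose(a, b):
    """Naive offline composition a . b (epsilon-free case, for brevity).

    Result states are pairs (state of a, state of b); an arc is created when
    an output label of `a` matches an input label of `b`, and tropical weights
    are added.  A full implementation would also need an epsilon filter.
    """
    start = (a.start, b.start)
    finals = {(qa, qb): wa + wb
              for qa, wa in a.finals.items()
              for qb, wb in b.finals.items()}
    c = WFST(start, finals)
    stack, seen = [start], {start}
    while stack:
        qa, qb = stack.pop()
        for ia, oa, wa, na in a.arcs[qa]:
            for ib, ob, wb, nb in b.arcs[qb]:
                if oa == ib:                 # output of a feeds input of b
                    dst = (na, nb)
                    c.add_arc((qa, qb), ia, ob, wa + wb, dst)
                    if dst not in seen:
                        seen.add(dst)
                        stack.append(dst)
    return c

# Toy usage: a one-word translation transducer T composed with a unigram
# "language model" acceptor L (weights are negative log probabilities).
T = WFST(start=0, finals={1: 0.0})
T.add_arc(0, "hon", "book", 0.2, 1)
T.add_arc(0, "hon", "books", 0.9, 1)

L = WFST(start=0, finals={1: 0.0})
L.add_arc(0, "book", "book", 0.1, 1)
L.add_arc(0, "books", "books", 0.5, 1)

TL = compose(T, L)   # the fully expanded model, built once before decoding
print(sorted(TL.arcs[(0, 0)]))
```

Building TL once ahead of time means the decoder searches a single machine instead of repeatedly intersecting T and L during search, which is the efficiency argument the abstract makes against dynamic composition.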
Year
2004
Venue
EMNLP
Keywords
machine translation
Field
Example-based machine translation, Computer science, Machine translation, Synchronous context-free grammar, Compiler, Statistical model, Transfer-based machine translation, Natural language processing, Artificial intelligence, Decoding methods, Ambiguity
DocType
Conference
Volume
W04-32
Citations
7
PageRank
0.53
References
16
Authors
2
Name | Order | Citations | PageRank
Hajime Tsukada | 1 | 449 | 29.46
Masaaki Nagata | 2 | 573 | 77.86