Title
Combining Statistical Language Models via the Latent Maximum Entropy Principle
Abstract
We present a unified probabilistic framework for statistical language modeling that can simultaneously incorporate various aspects of natural language, such as local word interaction, syntactic structure and semantic document information. Our approach is based on a statistical inference principle we have recently proposed--the latent maximum entropy principle--which allows relationships over hidden features to be effectively captured in a unified model. Our work extends previous research on maximum entropy methods for language modeling, which only allow observed features to be modeled. The ability to conveniently incorporate hidden variables lets us extend the expressiveness of language models while alleviating the need to pre-process the data to obtain explicitly observed features. We describe efficient algorithms for marginalization, inference and normalization in our extended models. We then use these techniques to combine two standard forms of language models: local lexical models (Markov N-gram models) and global document-level semantic models (probabilistic latent semantic analysis). Our experimental results on the Wall Street Journal corpus show that we obtain an 18.5% reduction in perplexity compared to the baseline tri-gram model with Good-Turing smoothing.
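For reference, the latent maximum entropy principle mentioned in the abstract can be sketched as the following constrained program (the notation here is assumed for illustration: $x$ denotes observed data, $z$ the hidden variables, $f_i$ the feature functions, and $\tilde{p}$ the empirical distribution over observations):

\[
\max_{p}\; H(p) = -\sum_{x,z} p(x,z)\,\log p(x,z)
\quad \text{s.t.} \quad
\sum_{x,z} p(x,z)\, f_i(x,z) \;=\; \sum_{x} \tilde{p}(x) \sum_{z} p(z \mid x)\, f_i(x,z), \qquad i = 1,\dots,N.
\]

When every feature depends only on the observed data, the right-hand side reduces to the ordinary empirical feature expectations, recovering the standard maximum entropy formulation that prior language-modeling work was restricted to.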
Year
2005
DOI
10.1007/s10994-005-0928-7
Venue
Machine Learning
Keywords
language modeling, N-gram models, latent semantic analysis, maximum entropy, latent variables
Field
Perplexity, Maximum-entropy Markov model, Latent variable model, Latent variable, Natural language processing, Probabilistic latent semantic analysis, Artificial intelligence, Principle of maximum entropy, Latent semantic analysis, Mathematics, Machine learning, Language model
DocType
Journal
Volume
60
Issue
1-3
ISSN
0885-6125
Citations
12
PageRank
0.57
References
22
Authors
4
Name             Order  Citations  PageRank
Shaojun Wang     1      468        38.96
Dale Schuurmans  2      2760       317.49
Fuchun Peng      3      1378       85.75
Yunxin Zhao      4      807        121.74