Title: Unsupervised model adaptation using information-theoretic criterion
Abstract: In this paper we propose a novel general framework for unsupervised model adaptation. Our method is based on entropy, which has previously been used as a regularizer in semi-supervised learning. In addition to conditional entropy, the criterion includes a term that measures the stability of the posteriors with respect to the model parameters. The idea is to select parameters that yield both low conditional entropy and stable decision rules. As an application, we demonstrate how this framework can be used to adjust the language model interpolation weight for a speech recognition task, adapting from Broadcast News data to MIT lecture data. We show that the new technique obtains performance comparable to fully supervised estimation of the interpolation parameters.
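The criterion described in the abstract (low conditional entropy plus stable posteriors) can be illustrated for a single language model interpolation weight. Below is a minimal sketch, assuming toy posteriors from two component models; the data, the finite-difference stability penalty, the weight `beta`, and the grid search are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

# Hypothetical posteriors from two component LMs over a 4-word
# vocabulary for 3 unlabeled contexts (rows sum to 1). Illustrative
# numbers only, not from the paper.
p_indomain = np.array([[0.70, 0.10, 0.10, 0.10],
                       [0.60, 0.20, 0.10, 0.10],
                       [0.80, 0.10, 0.05, 0.05]])
p_general  = np.array([[0.25, 0.25, 0.25, 0.25],
                       [0.30, 0.30, 0.20, 0.20],
                       [0.25, 0.25, 0.25, 0.25]])

def cond_entropy(lam):
    """Mean conditional entropy of the lam-interpolated posteriors."""
    p = lam * p_indomain + (1.0 - lam) * p_general
    return float(-(p * np.log(p)).sum(axis=1).mean())

def objective(lam, beta=0.1, eps=1e-3):
    """Entropy plus a stability penalty: the squared finite-difference
    sensitivity of the entropy to the interpolation weight."""
    grad = (cond_entropy(lam + eps) - cond_entropy(lam - eps)) / (2 * eps)
    return cond_entropy(lam) + beta * grad ** 2

# Grid-search the interpolation weight on the unlabeled data.
grid = np.linspace(0.01, 0.99, 99)
best_lam = min(grid, key=objective)
```

With these toy posteriors the sharply peaked in-domain model dominates, so the selected weight leans heavily toward it; on real held-out data the stability term would discourage weights in regions where small parameter changes flip decisions.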
Year: 2010
Venue: HLT-NAACL
Keywords: broadcast news data, unsupervised model adaptation, low conditional entropy, MIT lecture data, language model interpolation weight, information-theoretic criterion, conditional entropy, model parameter, interpolation parameter, novel general framework, new technique
Field: Decision rule, Broadcasting, Computer science, Interpolation, Artificial intelligence, Conditional entropy, Machine learning, Language model
DocType: Conference
ISBN: 1-932432-65-5
Citations: 3
PageRank: 0.44
References: 7
Authors: 4
1. Ariya Rastrow (Citations: 243, PageRank: 23.49)
2. Frederick Jelinek (Citations: 139, PageRank: 23.22)
3. Abhinav Sethy (Citations: 363, PageRank: 31.16)
4. Bhuvana Ramabhadran (Citations: 1779, PageRank: 153.83)