Title
Gradient Boosting Learning of Hidden Markov Models
Abstract
In this paper, we present a new training algorithm, gradient boosting learning, for Gaussian mixture density (GMD) based acoustic models. The algorithm is based on a function approximation scheme that optimizes in function space rather than parameter space: stage-wise additive expansions of GMDs are used to search for optimal models, instead of gradient-descent optimization of model parameters. In the proposed approach, each GMD starts from a single Gaussian and is built up by sequentially adding new components, each of which is globally selected to produce the optimal gain in the objective function. Maximum likelihood estimation (MLE) and maximum mutual information (MMI) training are unified under the H-criterion, which is optimized by the extended Baum-Welch (EBW) algorithm. A partial extended EM algorithm is developed for the stage-wise optimization of new components. Experimental results on the WSJ task demonstrate that the new algorithm leads to improved model quality and recognition performance.
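The abstract describes the stage-wise construction only at a high level. Below is a minimal Python sketch (NumPy only, 1-D data; names such as grow_gmm and partial_em are illustrative inventions, not from the paper) of the general boosting-style idea: grow a Gaussian mixture one component at a time, refine each candidate with an EM pass that updates only the new component and the mixture weights (a rough stand-in for the paper's partial extended EM), and keep the candidate with the largest objective gain. This is not the paper's method: the toy maximizes plain data log-likelihood with a random candidate search, whereas the paper globally selects components and optimizes the H-criterion, commonly a weighted combination of the MLE and MMI objectives (e.g. F_H = (1 - h) F_MLE + h F_MMI), with the extended Baum-Welch algorithm over HMM state distributions.

    import numpy as np

    def log_gauss(x, mu, var):
        # Elementwise log N(x | mu, var) for 1-D data.
        return -0.5 * (np.log(2.0 * np.pi * var) + (x - mu) ** 2 / var)

    def loglik(x, w, mu, var):
        # Total mixture log-likelihood: sum_n log sum_k w_k N(x_n | mu_k, var_k).
        comp = np.log(w)[None, :] + log_gauss(x[:, None], mu[None, :], var[None, :])
        return np.logaddexp.reduce(comp, axis=1).sum()

    def partial_em(x, w, mu, var, new, iters=20):
        # EM pass that re-estimates only component `new` (plus the mixture
        # weights), freezing the previously accepted components.
        for _ in range(iters):
            comp = np.log(w)[None, :] + log_gauss(x[:, None], mu[None, :], var[None, :])
            resp = np.exp(comp - np.logaddexp.reduce(comp, axis=1, keepdims=True))
            w = resp.mean(axis=0)
            r = resp[:, new]
            mu[new] = (r * x).sum() / r.sum()
            var[new] = max((r * (x - mu[new]) ** 2).sum() / r.sum(), 1e-6)
        return w, mu, var

    def grow_gmm(x, n_components=4, n_candidates=8, seed=0):
        rng = np.random.default_rng(seed)
        # Stage 0: start from a single Gaussian fit to the data.
        w, mu, var = np.array([1.0]), np.array([x.mean()]), np.array([x.var()])
        for _ in range(1, n_components):
            best = None
            for _ in range(n_candidates):
                # Candidate component centred on a random data point.
                w_t = np.append(w * (len(w) / (len(w) + 1.0)), 1.0 / (len(w) + 1))
                mu_t = np.append(mu, rng.choice(x))
                var_t = np.append(var, x.var() * rng.uniform(0.1, 1.0))
                w_t, mu_t, var_t = partial_em(x, w_t, mu_t, var_t, new=len(mu))
                score = loglik(x, w_t, mu_t, var_t)
                if best is None or score > best[0]:
                    best = (score, w_t, mu_t, var_t)
            _, w, mu, var = best  # keep the candidate with the largest gain
        return w, mu, var

    # Example: recover a bimodal density from synthetic data.
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(3.0, 1.0, 500)])
    w, mu, var = grow_gmm(x)
    print(w.round(3), mu.round(2), var.round(2))

Freezing the previously accepted components while tuning only the newest one keeps each stage cheap and stable, which is the main appeal of stage-wise additive modeling over re-fitting the whole mixture at every step.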
Year
2006
DOI
10.1109/ICASSP.2006.1660233
Venue
2006 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Vols. 1-13
Keywords
em algorithm, boosting, training data, gradient descent, acoustics, gaussian processes, parameter space, computer science, speech recognition, function space, maximum likelihood estimation, objective function, hidden markov models, approximation algorithms, hidden markov model, function approximation, mutual information, error correction
Field
Approximation algorithm, Mathematical optimization, Gradient descent, Pattern recognition, Function approximation, Computer science, Expectation–maximization algorithm, Artificial intelligence, Boosting (machine learning), Gaussian process, Hidden Markov model, Gradient boosting
DocType
Conference
ISSN
1520-6149
Citations
2
PageRank
0.41
References
8
Authors
3
Name         Order  Citations  PageRank
Rusheng Hu   1      21         2.54
Xiaolong Li  2      362        36.92
Yunxin Zhao  3      807        121.74