Title
Factorized Asymptotic Bayesian Hidden Markov Models
Abstract
This paper addresses the issue of model selection for hidden Markov models (HMMs). We generalize factorized asymptotic Bayesian inference (FAB), which has recently been developed for model selection with independent hidden variables (i.e., mixture models), to time-dependent hidden variables. As with FAB for mixture models, FAB for HMMs is derived as an iterative lower-bound maximization algorithm for a factorized information criterion (FIC). It inherits, from FAB for mixture models, several desirable properties for learning HMMs, such as asymptotic consistency of FIC with the marginal log-likelihood, a shrinkage effect for hidden state selection, and monotonic increase of the FIC lower bound through the iterative optimization. Further, it has no tunable hyper-parameter, so its model selection process can be fully automated. Experimental results show that FAB outperforms state-of-the-art variational Bayesian HMM and non-parametric Bayesian HMM in terms of model selection accuracy and computational efficiency.
Year: 2012
Venue: international conference on machine learning
DocType: Conference
Volume: abs/1206.4679
Citations: 10
PageRank: 0.63
References: 7
Authors: 2
Name            | Order | Citations | PageRank
Ryohei Fujimaki | 1     | 193       | 16.93
Hayashi, Kohei  | 2     | 159       | 15.31