Title
Scaling Factorial Hidden Markov Models: Stochastic Variational Inference without Messages
Abstract
Factorial Hidden Markov Models (FHMMs) are powerful models for sequential data, but they do not scale well to long sequences. We propose a scalable inference and learning algorithm for FHMMs that draws on ideas from the stochastic variational inference, neural network, and copula literatures. Unlike existing approaches, the proposed algorithm requires no message-passing procedure among latent variables and can be distributed across a network of computers to speed up learning. Our experiments show that the proposed algorithm introduces no additional approximation bias compared to the well-established structured mean-field algorithm, and that it achieves better performance on long sequences and large FHMMs.
Year
2016
Venue
Advances in Neural Information Processing Systems 29 (NIPS 2016)
Field
Inference, Copula (statistics), Latent variable, Artificial intelligence, Artificial neural network, Scaling, Message passing, Machine learning, Mathematics, Speedup, Scalability
DocType
Conference
Volume
29
ISSN
1049-5258
Citations
0
PageRank
0.34
References
0
Authors
3

Name | Order | Citations | PageRank
Ng, Yin Cheng | 1 | 1 | 0.68
Chilinski, Pawel M. | 2 | 0 | 0.34
Ricardo Bezerra de Andrade e Silva | 3 | 109 | 24.56