Title
Coupled Variational Bayes via Optimization Embedding
Abstract
Variational inference plays a vital role in learning graphical models, especially on large-scale datasets. Much of its success depends on a proper choice of the auxiliary distribution class for posterior approximation. However, pursuing an auxiliary distribution class that achieves both good approximation ability and computational efficiency remains a core challenge. In this paper, we propose coupled variational Bayes, which exploits the primal-dual view of the ELBO with a variational distribution class generated by an optimization procedure, termed optimization embedding. This flexible function class couples the variational distribution with the original parameters of the graphical model, allowing end-to-end learning of the graphical model by back-propagation through the variational distribution. Theoretically, we establish an interesting connection to gradient flow and demonstrate the extreme flexibility of this implicit distribution family in the limit. Empirically, we demonstrate the effectiveness of the proposed method on multiple graphical models with either continuous or discrete latent variables, compared to state-of-the-art methods.
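To make the abstract's idea concrete, here is a hypothetical toy sketch (not the paper's exact algorithm): "optimization embedding" generates variational samples by unrolling a fixed number of optimization steps on the model's log-density, starting from simple noise. Below, the target posterior is assumed to be N(mu, sigma^2), and the unrolled procedure is a discretized gradient flow (Langevin-style) on log p(z); the names `grad_log_p`, `embed`, and the step parameters are illustrative assumptions, not from the paper.

```python
import math
import random

# Assumed target posterior N(mu, sigma^2); in the paper these would be
# the graphical model's parameters, learned jointly with the embedding.
mu, sigma = 3.0, 1.0

def grad_log_p(z):
    """Gradient of log N(z; mu, sigma^2) with respect to z."""
    return -(z - mu) / sigma ** 2

def embed(z0, steps=200, lr=0.05, rng=random):
    """Unroll `steps` noisy gradient steps starting from noise z0.

    Every step is a differentiable function of (mu, sigma), so an
    autodiff framework could back-propagate model gradients through
    the whole unrolled procedure -- the coupling the abstract describes.
    """
    z = z0
    for _ in range(steps):
        z = z + lr * grad_log_p(z) + math.sqrt(2.0 * lr) * rng.gauss(0.0, 1.0)
    return z

random.seed(0)
samples = [embed(random.gauss(0.0, 1.0)) for _ in range(2000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"sample mean ~ {mean:.2f}, sample variance ~ {var:.2f}")
```

The injected noise term is what connects the unrolled steps to a gradient *flow* whose stationary distribution matches the target; pure gradient ascent without noise would collapse every sample onto the posterior mode.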
Year
2018
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018)
Keywords
proposed method, latent variables, variational bayes, gradient flow, variational inference, graphical models, end-to-end learning
Field
Mathematical optimization, Embedding, Inference, Computer science, Latent variable, Graphical model, Balanced flow, Bayes' theorem, Computation
DocType
Conference
Volume
31
ISSN
1049-5258
Citations
1
PageRank
0.35
References
0
Authors
8
Name          Order  Citations  PageRank
Bo Dai        1      230        34.71
Hanjun Dai    2      323        25.71
Niao He       3      212        16.52
Weiyang Liu   4      101        9.23
Zhen Liu      5      40         5.01
Jianshu Chen  6      883        52.94
Xiao, Lin     7      918        53.00
Le Song       8      24371      59.27