Title
Black-Box Alpha Divergence Minimization
Abstract
Black-box alpha (BB-$\alpha$) is a new approximate inference method based on the minimization of $\alpha$-divergences. BB-$\alpha$ scales to large datasets because it can be implemented using stochastic gradient descent. BB-$\alpha$ can be applied to complex probabilistic models with little effort since it only requires as input the likelihood function and its gradients. These gradients can be easily obtained using automatic differentiation. By changing the divergence parameter $\alpha$, the method is able to interpolate between variational Bayes (VB) ($\alpha \rightarrow 0$) and an algorithm similar to expectation propagation (EP) ($\alpha = 1$). Experiments on probit regression and neural network regression and classification problems show that BB-$\alpha$ with non-standard settings of $\alpha$, such as $\alpha = 0.5$, usually produces better predictions than with $\alpha \rightarrow 0$ (VB) or $\alpha = 1$ (EP).
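The abstract describes minimizing a single $\alpha$-dependent energy with stochastic gradient descent, with gradients supplied by automatic differentiation. Below is a minimal, hypothetical sketch of that idea (not the authors' released code) for the probit regression case the abstract mentions, using a common Monte Carlo reformulation of the BB-$\alpha$ energy with a factorized Gaussian approximation q and reparameterized samples; the constants ALPHA and K and the synthetic data are illustrative assumptions.

```python
# Sketch of BB-alpha minimization for Bayesian probit regression (assumed
# setup, not the paper's implementation). Setting ALPHA -> 0 recovers the
# VB objective (negative ELBO); ALPHA = 1 gives an EP-like objective.
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp
from jax.scipy.stats import norm

ALPHA = 0.5    # divergence parameter
K = 16         # Monte Carlo samples drawn from q
N, D = 200, 5  # dataset size and feature dimension (synthetic example)

def bb_alpha_energy(params, X, Y, key):
    mu, log_sigma = params
    sigma = jnp.exp(log_sigma)
    # Reparameterized samples w_k ~ q(w) = N(mu, diag(sigma^2)).
    eps = jax.random.normal(key, (K, D))
    ws = mu + sigma * eps                       # (K, D)
    # Probit log-likelihoods log p(y_n | x_n, w_k) = log Phi(y_n w_k^T x_n),
    # with labels y_n in {-1, +1}.
    ll = norm.logcdf(Y[None, :] * (ws @ X.T))   # (K, N)
    # Local terms -(1/alpha) log E_q[p(y_n | x_n, w)^alpha], MC estimate.
    local = -(1.0 / ALPHA) * (logsumexp(ALPHA * ll, axis=0) - jnp.log(K))
    # KL[q || p] for the Gaussian prior p(w) = N(0, I), in closed form.
    kl = 0.5 * jnp.sum(mu**2 + sigma**2 - 2.0 * log_sigma - 1.0)
    return jnp.sum(local) + kl

# Gradients of the energy come directly from automatic differentiation.
grad_fn = jax.jit(jax.grad(bb_alpha_energy))

# Synthetic probit data.
key = jax.random.PRNGKey(0)
key, kx, kw, ky = jax.random.split(key, 4)
X = jax.random.normal(kx, (N, D))
w_true = jax.random.normal(kw, (D,))
Y = jnp.sign(X @ w_true + 0.1 * jax.random.normal(ky, (N,)))

# Plain SGD on the variational parameters (mu, log_sigma).
params = (jnp.zeros(D), jnp.zeros(D))
lr = 0.05
for step in range(500):
    key, sub = jax.random.split(key)
    grads = grad_fn(params, X, Y, sub)
    params = tuple(p - lr * g for p, g in zip(params, grads))
```

In this sketch, mini-batching over the sum of local terms would give the stochastic gradient descent scaling described in the abstract, and swapping the probit log-likelihood for any other differentiable likelihood changes the model without changing the optimizer.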
Year
2016
Venue
ICML
Field
Stochastic gradient descent, Mathematical optimization, Combinatorics, Divergence, Likelihood function, Interpolation, Automatic differentiation, Approximate inference, Expectation propagation, Statistics, Mathematics, Bayes' theorem
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
6
Name                            Order   Citations   PageRank
José Miguel Hernández-Lobato    1       613         49.06
Yingzhen Li                     2       82          11.76
Mark Rowland                    3       11          2.92
Thang D. Bui                    4       57          5.77
Daniel Hernández-Lobato         5       440         26.10
Richard E. Turner               6       322         37.95