Title
Sampling-Free Variational Inference of Bayesian Neural Networks by Variance Backpropagation.
Abstract
We propose a new Bayesian Neural Net formulation that affords variational inference for which the evidence lower bound is analytically tractable subject to a tight approximation. We achieve this tractability by (i) decomposing ReLU nonlinearities into the product of an identity and a Heaviside step function, and (ii) introducing a separate path that decouples the neural net expectation from its variance. We demonstrate formally that introducing separate latent binary variables for the activations allows the neural network likelihood to be represented as a chain of linear operations. Performing variational inference on this construction enables a sampling-free computation of the evidence lower bound, which is a more effective approximation than the widely applied Monte Carlo sampling and CLT-related techniques. We evaluate the model on a range of regression and classification tasks against BNN inference alternatives, showing competitive or improved performance over the current state of the art.
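To illustrate the idea sketched in the abstract, the snippet below propagates the mean and variance of a factorized Gaussian through one Bayesian linear layer and a ReLU written as identity times Heaviside. This is a minimal sketch under simplifying assumptions (mean-field Gaussian weights, the Heaviside factor treated as a Bernoulli variable independent of its pre-activation, Gaussian moment matching for its success probability), not the authors' exact update equations; all function names are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def linear_moments(m_x, v_x, m_w, v_w, m_b, v_b):
    """Mean/variance of a = W x + b for factorized Gaussian W, b, x.

    m_w, v_w: (out, in) weight means/variances; m_b, v_b: (out,) bias
    moments; m_x, v_x: (in,) input moments. Assumes all factors independent.
    """
    m_a = m_w @ m_x + m_b
    # Var[w x] = v_w (m_x^2 + v_x) + m_w^2 v_x for independent w and x.
    v_a = v_w @ (m_x**2 + v_x) + (m_w**2) @ v_x + v_b
    return m_a, v_a

def relu_moments(m_a, v_a, eps=1e-12):
    """ReLU(a) = a * H(a): treat H(a) as Bernoulli(p) with
    p = P(a > 0) under a Gaussian moment match and, as a simplifying
    assumption, independent of a, so the output moments stay linear
    in the moments of a."""
    p = norm.cdf(m_a / np.sqrt(v_a + eps))
    m_out = p * m_a
    v_out = p * (v_a + m_a**2) - m_out**2
    return m_out, v_out

# One hidden layer of the two moment paths (toy sizes).
rng = np.random.default_rng(0)
m_x, v_x = rng.normal(size=4), np.full(4, 0.1)
m_w, v_w = rng.normal(size=(3, 4)), np.full((3, 4), 0.05)
m_b, v_b = np.zeros(3), np.full(3, 0.05)
m_h, v_h = relu_moments(*linear_moments(m_x, v_x, m_w, v_w, m_b, v_b))
print(m_h, v_h)
```

Because every step maps input moments to output moments in closed form, stacking such layers yields the sampling-free forward pass the abstract refers to.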
Year
2019
Venue
UAI
Field
Computer science, Inference, Artificial intelligence, Bayesian neural networks, Sampling (statistics), Backpropagation, Machine learning
DocType
Conference
Citations
1
PageRank
0.34
References
0
Authors
3
Name | Order | Citations | PageRank
Manuel Haussmann | 1 | 5 | 2.78
Fred A. Hamprecht | 2 | 962 | 76.24
Melih Kandemir | 3 | 182 | 16.91