Title
Practical Bayesian Learning of Neural Networks via Adaptive Subgradient Methods
Abstract
We introduce a novel framework for estimating the posterior distribution of the weights of a neural network, based on a new probabilistic interpretation of adaptive subgradient algorithms such as AdaGrad and Adam. Having a confidence measure over the weights allows several shortcomings of neural networks to be addressed. In particular, the robustness of the network can be improved by performing weight pruning based on signal-to-noise ratios computed from the weight posterior distribution. Using the MNIST dataset, we demonstrate that the empirical performance of Badam, a particular instance of our framework based on Adam, is competitive with related Bayesian approaches such as Bayes By Backprop.
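The abstract mentions pruning weights by their posterior signal-to-noise ratio. A minimal sketch of that idea, assuming a diagonal Gaussian posterior with per-weight means `mu` and standard deviations `sigma` (the names, threshold value, and helper `snr_prune` are illustrative, not from the paper):

```python
import numpy as np

def snr_prune(mu, sigma, threshold=1.0):
    """Zero out weights whose signal-to-noise ratio |mu| / sigma
    falls below `threshold`.

    mu    : posterior mean of each weight
    sigma : posterior standard deviation of each weight
    Returns (pruned_weights, keep_mask).
    """
    snr = np.abs(mu) / sigma          # per-weight signal-to-noise ratio
    mask = snr >= threshold           # True for weights we keep
    return mu * mask, mask

# Toy example: weights with small |mu| relative to sigma are pruned.
mu = np.array([0.5, 0.01, -1.2, 0.02])
sigma = np.array([0.1, 0.2, 0.3, 0.5])
pruned, mask = snr_prune(mu, sigma, threshold=1.0)
# SNRs are [5.0, 0.05, 4.0, 0.04], so only the 1st and 3rd weights survive.
```

Weights with a low ratio are those the posterior is least certain about relative to their magnitude, so removing them is a natural way to sparsify the network with minimal expected impact.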
Year: 2018
Venue: arXiv: Machine Learning
DocType: Journal
Volume: abs/1811.03679
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name                Order  Citations  PageRank
Arnold Salas        1      0          0.68
Stefan Zohren       2      1          2.08
Stephen J. Roberts  3      12441      74.70