Title
Variational closed-form deep neural net inference
Abstract
• A novel Bayesian neural net construction allowing closed-form variational inference.
• Closed-form updates are made tractable by decomposing ReLU into two components (illustrated in the sketch below).
• The resulting inference scheme converges fast and is compatible with online learning.
• State-of-the-art learning curves when applied to Bayesian active learning.
• Outperforms deterministic neural nets in scarce-data regimes.
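A minimal sketch of the decomposition mentioned in the second highlight, assuming only the standard identity max(0, x) = 1[x > 0] · x (a binary gate times a linear map); the variable names are illustrative and the paper's closed-form variational updates are not reproduced here.

    import numpy as np

    # Illustrative only: ReLU written as a binary gate times a linear map,
    # max(0, x) = 1[x > 0] * x. Treating the gate and the linear part as two
    # separate components is the kind of decomposition the highlight refers to.
    x = np.random.randn(5)
    gate = (x > 0).astype(x.dtype)   # binary on/off component
    linear = x                       # identity (linear) component
    assert np.allclose(gate * linear, np.maximum(0.0, x))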
Year
2018
DOI
10.1016/j.patrec.2018.07.001
Venue
Pattern Recognition Letters
Keywords
Bayesian Neural Networks, Variational Bayes, Online learning, Active learning
Field
Small data, Active learning, Pattern recognition, Inference, Artificial intelligence, Expectation propagation, Deep learning, Artificial neural network, Mathematics, Bayesian probability, Bayes' theorem
DocType
Journal
Volume
112
ISSN
0167-8655
Citations
0
PageRank
0.34
References
16
Authors
1
Name
Melih Kandemir
Order
1
Citations
182
PageRank
16.91