Title: A Bayesian encourages dropout
Abstract: Dropout is one of the key techniques for preventing learning from overfitting. It has been explained as a kind of modified L2 regularization. Here, we shed light on dropout from a Bayesian standpoint. The Bayesian interpretation enables us to optimize the dropout rate, which benefits both the learning of the weight parameters and prediction after learning. Experimental results also support optimizing the dropout rate.
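As a sketch of the mechanism the abstract discusses, below is a minimal inverted-dropout layer in NumPy. The function name, the fixed rate of 0.5, and the inverted-scaling variant are illustrative assumptions, not taken from the paper; the paper's point is that the dropout rate itself can be treated as a quantity to optimize under a Bayesian interpretation rather than fixed by hand.

```python
import numpy as np

def dropout(x, rate, rng, train=True):
    """Inverted dropout (illustrative sketch, not the paper's method).

    During training, each unit is zeroed with probability `rate`, and
    survivors are scaled by 1/(1-rate) so the expected activation is
    unchanged; at test time the input passes through untouched.
    """
    if not train or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate  # keep each unit with prob 1-rate
    return x * mask / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.ones(10_000)
y = dropout(x, rate=0.5, rng=rng)
# Roughly half the units are zeroed, the rest doubled,
# so the mean activation stays close to 1.
print(round(float(y.mean()), 1))
```

Because the scaling keeps the expectation fixed, changing `rate` trades off regularization strength without shifting the activation scale, which is what makes the rate a clean hyperparameter to tune or, as the paper proposes, to optimize.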
Year: 2014
Venue: CoRR
Field: Regularization (mathematics), Artificial intelligence, Overfitting, Machine learning, Mathematics, Bayesian probability
DocType: Journal
Volume: abs/1412.7003
Citations: 7
PageRank: 1.20
References: 6
Authors: 1
Author details:
  Name: Shin-ichi Maeda
  Order: 1
  Citations: 26
  PageRank: 8.11