Title
On Fast Dropout and its Applicability to Recurrent Networks.
Abstract
Recurrent Neural Networks (RNNs) are rich models for the processing of sequential data. Recent work on advancing the state of the art has focused on the optimization or modelling of RNNs, mostly motivated by addressing the problems of vanishing and exploding gradients. The control of overfitting has received considerably less attention. This paper contributes to that by analyzing fast dropout, a recent regularization method for generalized linear models and neural networks, from a back-propagation-inspired perspective. We show that fast dropout implements a quadratic form of an adaptive, per-parameter regularizer, which rewards large weights in the light of underfitting, penalizes them for overconfident predictions, and vanishes at minima of an unregularized training loss. The derivatives of that regularizer are based exclusively on the training error signal. One consequence of this is the absence of a global weight attractor, which is particularly appealing for RNNs, since their dynamics are not biased towards a certain regime. We test the hypothesis that this improves the performance of RNNs and confirm it on four musical data sets.
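As background for the analysis summarized above, fast dropout (Wang & Manning, 2013) replaces sampling of Bernoulli dropout masks with a Gaussian approximation of the resulting pre-activations, so the expected behaviour under dropout can be evaluated deterministically. The following is a minimal sketch of that mean-and-variance propagation; the NumPy setup and function names are illustrative assumptions, not code from the paper.

```python
# Illustrative sketch (not the authors' code) of the Gaussian approximation
# used by fast dropout: for z = sum_i d_i * w_i * x_i with d_i ~ Bernoulli(p),
# the mask statistics give E[z] = p * sum_i w_i x_i and
# Var[z] = p * (1 - p) * sum_i (w_i x_i)^2.
import numpy as np

def fast_dropout_preactivation(x, W, keep_prob=0.5):
    """Mean and variance of pre-activations under dropout on the inputs x.

    x: deterministic input vector, shape (n_in,)
    W: weight matrix, shape (n_in, n_out)
    keep_prob: probability p of keeping a unit
    (Variance contributions from stochastic inputs in deeper layers are omitted.)
    """
    p = keep_prob
    mean = p * (x @ W)                            # E[z]
    var = p * (1.0 - p) * ((x ** 2) @ (W ** 2))   # Var[z]
    return mean, var

# One Gaussian draw replaces sampling an explicit dropout mask.
rng = np.random.default_rng(0)
x = rng.normal(size=10)
W = rng.normal(size=(10, 3))
mu, var = fast_dropout_preactivation(x, W)
z = mu + np.sqrt(var) * rng.normal(size=mu.shape)
```

Downstream nonlinearities and the loss are then evaluated under this Gaussian, in expectation, which is what makes the regularizer described in the abstract amenable to analysis.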
Year
2013
Venue
International Conference on Learning Representations
Field
Attractor, Data set, Computer science, Quadratic form, Recurrent neural network, Maxima and minima, Regularization (mathematics), Artificial intelligence, Overfitting, Artificial neural network, Machine learning
DocType
Volume
abs/1311.0701
Citations
15
Journal
PageRank
2.04
References
24
Authors
6
Name                    Order  Citations  PageRank
Justin Bayer            1      157        32.38
Christian Osendorfer    2      125        13.24
Daniela Korhammer       3      15         2.04
Nutan Chen              4      26         6.10
Sebastian Urban         5      22         5.38
Patrick van der Smagt   6      188        24.23