Title
Stochastic Training of Residual Networks: a Differential Equation Viewpoint.
Abstract
During the last few years, significant attention has been paid to the stochastic training of artificial neural networks, which is known as an effective regularization approach that helps improve the generalization capability of trained models. In this work, the method of modified equations is applied to show that the residual network and its variants with noise injection can be regarded as weak approximations of stochastic differential equations. Such observations enable us to bridge stochastic training processes with the optimal control of backward Kolmogorov's equations. This not only offers a novel perspective on the effects of regularization from the loss-landscape viewpoint but also sheds light on the design of more reliable and efficient stochastic training strategies. As an example, we propose a new way to utilize Bernoulli dropout within the plain residual network architecture and conduct experiments on a real-world image classification task to substantiate our theoretical findings.
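To make the abstract's central idea concrete, the sketch below illustrates a residual block whose residual branch is gated by a Bernoulli variable at training time, so that the discrete update x_{n+1} = x_n + b_n f(x_n) with b_n ~ Bernoulli(p) becomes a noisy perturbation of the Euler step x_{n+1} = x_n + f(x_n); it is this kind of update that the method of modified equations relates to a stochastic differential equation. This is a minimal illustrative sketch assuming a PyTorch-style implementation, not the authors' code: the module name BernoulliResidualBlock and the keep_prob parameter are assumptions for exposition only.

```python
import torch
import torch.nn as nn


class BernoulliResidualBlock(nn.Module):
    """Residual block with Bernoulli dropout applied to the residual branch.

    Training-time update: x <- x + b * f(x), with b ~ Bernoulli(keep_prob),
    i.e. a stochastic perturbation of the deterministic Euler step x + f(x).
    """

    def __init__(self, channels: int, keep_prob: float = 0.9):
        super().__init__()
        self.keep_prob = keep_prob
        self.residual = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.residual(x)
        if self.training:
            # One Bernoulli draw per sample gates the residual branch,
            # injecting the multiplicative noise the abstract refers to.
            b = torch.bernoulli(
                torch.full((x.size(0), 1, 1, 1), self.keep_prob, device=x.device)
            )
            return torch.relu(x + b * f)
        # At test time the branch is scaled by its expectation.
        return torch.relu(x + self.keep_prob * f)
```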
Year
2018
Venue
arXiv: Learning
DocType
Journal
Volume
abs/1812.00174
Citations
2
PageRank
0.38
References
19
Authors
3
Name          Order  Citations  PageRank
Qi Sun        1      11         6.98
Yunzhe Tao    2      2          1.06
Qiang Du      3      1692       188.27