Title
Acceleration and Averaging in Stochastic Descent Dynamics
Abstract
We formulate and study a general family of (continuous-time) stochastic dynamics for accelerated first-order minimization of smooth convex functions. Building on an averaging formulation of accelerated mirror descent, we propose a stochastic variant in which the gradient is contaminated by noise, and study the resulting stochastic differential equation. We prove a bound on the rate of change of an energy function associated with the problem, then use it to derive estimates of convergence rates of the function values (almost surely and in expectation), both for persistent and asymptotically vanishing noise. We discuss the interaction between the parameters of the dynamics (learning rate and averaging rates) and the covariation of the noise process. In particular, we show how the asymptotic rate of covariation affects the choice of parameters and, ultimately, the convergence rate.
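As a rough illustration of the kind of dynamics the abstract describes, the following Python sketch simulates a stochastic variant of the averaging formulation of accelerated (mirror) descent via Euler-Maruyama discretization, specialized to the Euclidean case (identity mirror map). The specific coefficients (an r/t averaging rate and a t/r learning rate), the noise schedule sigma(t) = sigma0/t^alpha, and the least-squares test objective are illustrative assumptions, not the paper's exact dynamics.

```python
# A minimal sketch of stochastic accelerated dynamics with averaging,
# simulated by Euler--Maruyama. Euclidean specialization with assumed
# rates: dX = (r/t)(Z - X) dt,  dZ = -(t/r)(grad f(X) dt + sigma(t) dB).
import numpy as np

def grad_f(x, A, b):
    """Gradient of the smooth convex test objective f(x) = 0.5*||Ax - b||^2."""
    return A.T @ (A @ x - b)

def stochastic_accelerated_dynamics(A, b, T=50.0, h=1e-3, r=3.0,
                                    sigma0=0.5, alpha=1.0, seed=0):
    """Simulate the SDE above on [1, T] with step h (parameters are assumptions)."""
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    x = np.zeros(d)
    z = np.zeros(d)
    t = 1.0  # start away from t = 0 to avoid the r/t singularity
    while t < T:
        # Brownian increment scaled by the (assumed) vanishing noise schedule.
        noise = sigma0 / t**alpha * np.sqrt(h) * rng.standard_normal(d)
        x = x + h * (r / t) * (z - x)                     # averaging step
        z = z - (t / r) * (h * grad_f(x, A, b) + noise)   # noisy gradient step
        t += h
    return x

# Usage: a random least-squares instance.
rng = np.random.default_rng(1)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
x_T = stochastic_accelerated_dynamics(A, b)
print("suboptimality:", 0.5 * np.linalg.norm(A @ x_T - b)**2
      - 0.5 * np.linalg.norm(A @ x_star - b)**2)
```

Qualitatively, this toy simulation mirrors the abstract's distinction: with asymptotically vanishing noise (alpha > 0) the suboptimality keeps decaying, while with persistent noise (alpha = 0) it plateaus at a noise-dependent level.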
Year
2017
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017)
Field
Convergence, Mathematical optimization, Stochastic gradient descent, Stochastic differential equation, Minimization, Convex function, Rate of convergence, Acceleration, Almost surely, Mathematics
DocType
Conference
Volume
30
ISSN
1049-5258
Citations
2
PageRank
0.38
References
11
Authors
2
Name                Order   Citations   PageRank
Walid Krichene      1       108         14.02
Peter L. Bartlett   2       5482        1039.97