Title
The Physical Systems Behind Optimization Algorithms
Abstract
We use differential-equation-based approaches to provide physical insights into the dynamics of popular optimization algorithms in machine learning. In particular, we study gradient descent, proximal gradient descent, coordinate gradient descent, proximal coordinate gradient descent, and Newton's method, as well as their Nesterov-accelerated variants, in a unified framework motivated by a natural connection between optimization algorithms and physical systems. Our analysis applies to more general algorithms and to optimization problems beyond convexity and strong convexity, e.g., the Polyak-Lojasiewicz and error bound conditions (possibly nonconvex).
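The connection the abstract alludes to is, in its standard form, the following; this is a minimal sketch of the usual continuous-time limits under a conventional damped-oscillator modeling, not a restatement of the paper's own derivation. Gradient descent can be read as the explicit Euler discretization of gradient flow (a massless particle sliding in the potential f), while Nesterov-type accelerated methods correspond to a second-order system with damping (a massive particle with friction), for suitable mass m and damping coefficient c(t):

\[ \dot{x}(t) = -\nabla f(x(t)) \quad\Longrightarrow\quad x_{k+1} = x_k - \eta\,\nabla f(x_k) \ \ \text{(explicit Euler, step size } \eta\text{)} \]

\[ m\,\ddot{x}(t) + c(t)\,\dot{x}(t) + \nabla f(x(t)) = 0 \quad \text{(damped oscillator; acceleration arises from suitable choices of } m \text{ and } c(t)\text{)} \]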
Year
2018
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018)
Keywords
unified framework, physical systems, strong convexity, optimization problems, gradient descent, machine learning, ordinary differential equation
DocType
Conference
Volume
31
ISSN
1049-5258
Citations
0
PageRank
0.34
References
0
Authors
4
Name                 Order  Citations  PageRank
Lin Yang             1      31         21.21
R. Arora             2      489        35.97
Vladimir Braverman   3      357        34.36
Tuo Zhao             4      222        40.58