Title
Accelerated Gradient Methods for Networked Optimization
Abstract
We develop multi-step gradient methods for network-constrained optimization of strongly convex functions with Lipschitz-continuous gradients. Given the topology of the underlying network and bounds on the Hessian of the objective function, we determine the algorithm parameters that guarantee the fastest convergence and characterize situations in which significant speed-ups can be obtained over the standard gradient method. Furthermore, we quantify how the performance of the gradient method and its accelerated counterpart is affected by uncertainty in the problem data, and conclude that in most cases our proposed method outperforms gradient descent. Finally, we apply the proposed technique to three engineering problems: resource allocation under network-wide budget constraints, distributed averaging, and Internet congestion control. In all cases, we demonstrate that our algorithm converges more rapidly than alternative algorithms reported in the literature.
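To make the setting concrete, the sketch below shows a generic multi-step ("heavy-ball") gradient iteration with the classical parameter choice obtained from a strong-convexity constant mu and a Lipschitz constant L bounding the Hessian. It is a minimal illustration only: the function name heavy_ball, the diagonal test problem, and the parameter formulas are assumptions for this sketch and do not reproduce the network-specific tuning derived in the paper.

```python
import numpy as np

def heavy_ball(grad, x0, mu, L, iters=500):
    """Multi-step (heavy-ball) gradient iteration with the classical
    parameter choice for a mu-strongly convex objective whose gradient
    is L-Lipschitz. Illustrative sketch; not the paper's network-tuned
    parameters."""
    alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2                          # step size
    beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2  # momentum
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        # x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

if __name__ == "__main__":
    # Quadratic f(x) = 0.5 x^T A x with Hessian eigenvalues in [mu, L] = [1, 10]
    A = np.diag([1.0, 10.0])
    grad = lambda x: A @ x
    x_star = heavy_ball(grad, np.array([5.0, -3.0]), mu=1.0, L=10.0)
    print(x_star)  # approaches the minimizer at the origin
```

On quadratics such as the test problem above, the iterates contract at roughly the factor (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)) per step, which is the kind of speed-up over plain gradient descent that the abstract refers to.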
Year
2012
DOI
10.1109/TSP.2013.2278149
Venue
American Control Conference
Keywords
Hessian matrices, gradient methods, optimisation, Hessian bounds, Internet congestion control, accelerated gradient algorithm, accelerated gradient method, consensus problems, convergence rates, distributed resource allocation, networked optimization, optimal algorithm parameter
Field
Gradient method, Convergence (routing), Mathematical optimization, Algorithm design, Upper and lower bounds, Computer science, Control theory, Hessian matrix, Resource allocation, Rate of convergence, Acceleration
DocType
Journal
Volume
abs/1211.2132
Issue
21
ISSN
1053-587X
Citations
5
PageRank
0.64
References
7
Authors
3
Name              Order  Citations  PageRank
Euhanna Ghadimi   1      275        13.75
Iman Shames       2      633        48.29
Mikael Johansson  3      1612       147.94