Title
Multi-Step Gradient Methods for Networked Optimization
Abstract
We develop multi-step gradient methods for network-constrained optimization of strongly convex functions with Lipschitz-continuous gradients. Given the topology of the underlying network and bounds on the Hessian of the objective function, we determine the algorithm parameters that guarantee the fastest convergence and characterize situations in which significant speed-ups over the standard gradient method are obtained. Furthermore, we quantify how uncertainty in problem data at design time affects the run-time performance of the gradient method and its multi-step counterpart, and conclude that in most cases the multi-step method outperforms gradient descent. Finally, we apply the proposed technique to three engineering problems: resource allocation under a network-wide budget constraint, distributed averaging, and Internet congestion control. In all cases, our proposed algorithms converge significantly faster than the state of the art.
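The multi-step gradient methods referred to in the abstract belong to the family of Polyak's heavy-ball iteration, which augments plain gradient descent with a momentum term. The sketch below is a minimal, self-contained illustration of that iteration with the classical parameter tuning expressed in terms of the strong-convexity and smoothness constants (mu, L); the quadratic test problem, dimensions, and NumPy implementation are illustrative assumptions and do not reproduce the paper's network-structured algorithms or applications.

```python
import numpy as np

def heavy_ball(grad, x0, mu, L, iters=500):
    """Multi-step (heavy-ball) gradient iteration:
        x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
    with the classical tuning for a mu-strongly convex objective
    whose gradient is L-Lipschitz."""
    alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
    beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Illustrative strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x
rng = np.random.default_rng(0)
Q = rng.standard_normal((20, 20))
A = Q.T @ Q + np.eye(20)                  # positive-definite Hessian
b = rng.standard_normal(20)
mu, L = np.linalg.eigvalsh(A)[[0, -1]]    # strong-convexity / smoothness bounds
x_star = np.linalg.solve(A, b)            # exact minimizer for comparison
x_hb = heavy_ball(lambda x: A @ x - b, np.zeros(20), mu, L)
print("distance to optimum:", np.linalg.norm(x_hb - x_star))
```

For quadratic objectives this tuning yields a convergence factor of roughly (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)), compared with (L - mu) / (L + mu) for optimally tuned gradient descent, which is the kind of speed-up the abstract quantifies for the networked setting.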
Year
2013
DOI
10.1109/TSP.2013.2278149
Venue
IEEE Transactions on Signal Processing
Keywords
signal processing, resource allocation, internet
DocType
Journal
Volume
61
Issue
21
ISSN
1053-587X
Citations
4
PageRank
0.44
References
0
Authors
3
Name                Order  Citations  PageRank
Euhanna Ghadimi     1      275        13.75
Iman Shames         2      633        48.29
Mikael Johansson    3      1612       147.94