Title: Proximal Diffusion for Stochastic Costs with Non-Differentiable Regularizers
Abstract: We consider networks of agents cooperating to minimize a global objective, modeled as the aggregate sum of regularized costs that are not required to be differentiable. Since the subgradients of the individual costs cannot generally be assumed to be uniformly bounded, general distributed subgradient techniques are not applicable to these problems. We isolate the requirement of bounded subgradients into the regularizer and use splitting techniques to develop a stochastic proximal diffusion strategy that solves the optimization problem by continuously learning from streaming data. We represent the implementation as the cascade of three operators and invoke Banach's fixed-point theorem to establish that, despite gradient noise, the stochastic implementation converges in the mean-square-error sense to within O(μ) of the optimal solution, for a sufficiently small step-size parameter μ.
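The abstract's strategy combines a stochastic gradient step on the smooth part of the cost with the proximal operator of the non-differentiable regularizer. As a hedged illustration (not the paper's exact diffusion algorithm, which additionally combines iterates across networked agents), the following sketch shows a single-agent stochastic proximal-gradient update for a least-mean-squares cost with an ℓ1 regularizer, whose proximal operator is soft-thresholding; all variable names and parameter values here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of the l1 regularizer tau * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def stochastic_proximal_step(w, grad_sample, mu, rho):
    """One update: gradient step on the smooth cost using a noisy
    (instantaneous) gradient, followed by the prox of the regularizer."""
    return soft_threshold(w - mu * grad_sample(w), mu * rho)

# Toy streaming problem: minimize E|d - u^T w|^2 + rho * ||w||_1,
# learning from one random sample (u, d) per iteration.
rng = np.random.default_rng(0)
w_true = np.array([1.0, 0.0, -0.5])   # assumed ground-truth model
w = np.zeros(3)
mu, rho = 0.01, 0.1                   # step size and regularization weight
for _ in range(5000):
    u = rng.standard_normal(3)
    d = u @ w_true + 0.01 * rng.standard_normal()
    grad = lambda w: -2.0 * u * (d - u @ w)   # noisy gradient of |d - u^T w|^2
    w = stochastic_proximal_step(w, grad, mu, rho)
```

Consistent with the O(μ) result described in the abstract, the iterate hovers in a small neighborhood of the regularized solution (here, `w_true` shrunk slightly by the ℓ1 penalty) rather than converging exactly, and the neighborhood shrinks as the step size μ decreases.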
Year: 2015
Venue: 2015 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
Keywords: Distributed optimization, diffusion strategy, proximal operator, gradient noise, fixed point, regularization
Field: Stochastic optimization, Mathematical optimization, Subgradient method, Computer science, Stochastic process, Uniform boundedness, Differentiable function, Operator (computer programming), Optimization problem, Bounded function
DocType: Conference
ISSN: 1520-6149
Citations: 2
PageRank: 0.38
References: 15
Authors: 2
1. Stefan Vlaski (Citations: 231, PageRank: 1.39)
2. Ali H. Sayed (Citations: 91346, PageRank: 67.71)