Title
A Random Monotone Operator Framework For Strongly Convex Stochastic Optimization
Abstract
The analysis of every algorithm for stochastic optimization seems to require a different convergence proof. It would be desirable to have a unified mathematical framework within which proofs of convergence, along with convergence rates, can be obtained with minimal extra effort. We present a unified convergence analysis framework, based on random monotone operators, for iterative algorithms for strongly convex stochastic optimization. The framework offers both versatility and simplicity, and allows for clean and straightforward analysis of many algorithms for stochastic convex minimization, saddle-point problems, and variational inequalities. We show convergence of the random operator to a probabilistic fixed point and obtain non-asymptotic rates of convergence. The analysis technique relies on a novel stochastic dominance argument.
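As a hedged illustration of the setting the abstract describes (the notation and the specific instance below are ours, not reproduced from the paper), an iterative algorithm of this kind can be viewed as repeatedly applying a random operator \(T_{\xi_k}\) to the current iterate:

\[
  x_{k+1} = T_{\xi_k}(x_k), \qquad \xi_k \overset{\text{i.i.d.}}{\sim} \mathcal{D}.
\]

One natural reading of a probabilistic fixed point (not necessarily the paper's exact definition) is a point \(x^\star\) fixed by the mean operator \(\bar{T}(x) := \mathbb{E}_{\xi}\!\left[T_{\xi}(x)\right]\), i.e. \(\bar{T}(x^\star) = x^\star\). Strong convexity of the underlying problem corresponds to strong monotonicity of the associated operator, which typically makes \(\bar{T}\) contractive and is what non-asymptotic rate bounds exploit. For instance, stochastic gradient descent on a smooth, strongly convex objective \(f(x) = \mathbb{E}_{\xi}[f(x;\xi)]\) fits this template with \(T_{\xi}(x) = x - \eta \nabla f(x;\xi)\) for a sufficiently small stepsize \(\eta > 0\).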
Year
2017
Venue
2017 IEEE 56th Annual Conference on Decision and Control (CDC)
Field
Convergence, Stochastic optimization, Mathematical optimization, Computer science, Iterative method, Stochastic dominance, Convex function, Fixed point, Convex optimization, Variational inequality
DocType
Conference
ISSN
0743-1546
Citations
0
PageRank
0.34
References
0
Authors
2
Name                Order  Citations  PageRank
William B. Haskell  1      58         12.04
Rahul Jain          2      65         6.67