Title
Stochastic Continuous Greedy++: When Upper and Lower Bounds Match
Abstract
In this paper, we develop Stochastic Continuous Greedy++ (SCG++), the first efficient variant of a conditional gradient method for maximizing a continuous submodular function subject to a convex constraint. Concretely, for a monotone and continuous DR-submodular function, SCG++ achieves a tight [(1 − 1/e)OPT − ε] solution while using O(1/ε²) stochastic gradients and O(1/ε) calls to the linear optimization oracle. The best previously known algorithms either achieve a suboptimal [(1/2)OPT − ε] solution with O(1/ε²) stochastic gradients or the tight [(1 − 1/e)OPT − ε] solution with suboptimal O(1/ε³) stochastic gradients. We further provide an information-theoretic lower bound to showcase the necessity of O(1/ε²) stochastic oracle queries in order to achieve [(1 − 1/e)OPT − ε] for monotone and DR-submodular functions. This result shows that our proposed SCG++ enjoys optimality in terms of both approximation guarantee, i.e., the (1 − 1/e) approximation factor, and stochastic gradient evaluations, i.e., O(1/ε²) calls to the stochastic oracle. By using stochastic continuous optimization as an interface, we also show that it is possible to obtain the tight [(1 − 1/e)OPT − ε] approximation guarantee for maximizing a monotone but stochastic submodular set function subject to a general matroid constraint after at most O(n²/ε²) calls to the stochastic function value, where n is the number of elements in the ground set.
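The abstract states the guarantees but not the update rule, so the following is a minimal, hedged sketch of the continuous-greedy (conditional gradient) template that SCG and SCG++ build on, written in Python against a toy problem. The quadratic objective, the cardinality-style constraint set, the averaged stochastic gradient, and all function names (stochastic_gradient, lmo, stochastic_continuous_greedy) are illustrative assumptions, and SCG++'s variance-reduced gradient estimator is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy monotone DR-submodular objective on [0, 1]^n:
#   f(x) = a^T x - 0.5 * x^T H x  with a, H >= 0 and a >= H @ 1,
# so the gradient a - H x stays nonnegative on the box (monotone) and the
# Hessian -H has nonpositive entries (DR-submodular). Purely illustrative.
n, k = 10, 3
H = rng.uniform(0.0, 0.1, size=(n, n))
H = (H + H.T) / 2.0
a = H.sum(axis=1) + rng.uniform(0.5, 1.0, size=n)

def stochastic_gradient(x, noise=0.1):
    """Unbiased noisy gradient of f at x (stand-in for the stochastic oracle)."""
    return a - H @ x + noise * rng.standard_normal(n)

def lmo(g):
    """Linear maximization oracle over {v in [0,1]^n : sum(v) <= k}:
    put mass 1 on the (at most) k coordinates with the largest positive scores."""
    v = np.zeros(n)
    top = np.argsort(-g)[:k]
    v[top[g[top] > 0]] = 1.0
    return v

def stochastic_continuous_greedy(T=200, rho=0.5):
    """Frank-Wolfe style continuous greedy with an averaged stochastic gradient.
    This follows the generic SCG template; SCG++ additionally uses a
    variance-reduced gradient estimator, which is omitted in this sketch."""
    x = np.zeros(n)
    g = np.zeros(n)
    for t in range(1, T + 1):
        rho_t = rho / t ** (2 / 3)      # decaying averaging weight
        g = (1 - rho_t) * g + rho_t * stochastic_gradient(x)
        v = lmo(g)                      # ascent direction from the linear oracle
        x = x + v / T                   # move a 1/T fraction toward v
    return x

x_hat = stochastic_continuous_greedy()
print("f(x_hat) =", a @ x_hat - 0.5 * x_hat @ H @ x_hat)
```

Since the iterate starts at the origin and each of the T steps moves a 1/T fraction toward a feasible vertex returned by the linear oracle, the output stays inside the down-closed polytope {x in [0,1]^n : sum(x) <= k} without any projection step.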
Year
2019
Venue
NeurIPS
Keywords
continuous optimization, submodular function, conditional gradient method
Field
Mathematical optimization, Upper and lower bounds, Computer science
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
4
Name                  Order  Citations  PageRank
Amin Karbasi          1      455        45.00
Seyed Hamed Hassani   2      151        22.04
Aryan Mokhtari        3      211        24.93
Zebang Shen           4      17         9.36