Title
Dispersion for Data-Driven Algorithm Design, Online Learning, and Private Optimization
Abstract
A crucial problem in modern data science is data-driven algorithm design, where the goal is to choose the best algorithm, or algorithm parameters, for a specific application domain. In practice, we often optimize over a parametric algorithm family, searching for parameters with high performance on a collection of typical problem instances. While effective in practice, these procedures generally have not come with provable guarantees. A recent line of work initiated by a seminal paper of Gupta and Roughgarden (2017) analyzes application-specific algorithm selection from a theoretical perspective. We advance this research direction in several important settings. We provide upper and lower bounds on regret for algorithm selection in online settings, where problems arrive sequentially and we must choose parameters online. We also consider differentially private algorithm selection, where the goal is to find good parameters for a set of problems without divulging too much of the sensitive information contained therein. We analyze several important parameterized families of algorithms, including SDP-rounding schemes for problems formulated as integer quadratic programs as well as greedy techniques for several canonical subset selection problems. The cost function that measures an algorithm's performance is often a volatile piecewise Lipschitz function of its parameters, since a small change to the parameters can lead to a cascade of different decisions made by the algorithm. We present general techniques for optimizing the sum or average of piecewise Lipschitz functions when the underlying functions satisfy a sufficient and general condition called dispersion. Intuitively, a set of piecewise Lipschitz functions is dispersed if no small region contains many of the functions' discontinuities. Using dispersion, we improve over the best-known online learning regret bounds for a variety of problems, prove regret bounds for problems not previously studied, and provide matching regret lower bounds. In the private optimization setting, we show how to optimize performance while preserving privacy for several important problems, providing matching upper and lower bounds on the performance loss due to privacy preservation. Though algorithm selection is our primary motivation, we believe the notion of dispersion may be of independent interest. Therefore, we present our results for the more general problem of optimizing piecewise Lipschitz functions. Finally, we uncover dispersion in domains beyond algorithm selection, namely auction design and pricing, providing online and privacy guarantees for these problems as well.
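The abstract's intuition for dispersion can be made concrete. Below is a minimal LaTeX sketch of a (w, k)-style dispersion condition consistent with the description above; the exact definition and parameterization used in the paper may differ, so this should be read as an informal paraphrase rather than the authors' precise statement.

\begin{verbatim}
% Informal rendering of dispersion, following the abstract's intuition.
% The exact definition and parameters in the paper may differ.
\begin{definition}[Dispersion, informal]
Let $u_1, \dots, u_T \colon \mathcal{C} \to [0,1]$ be piecewise
$L$-Lipschitz functions over a parameter space
$\mathcal{C} \subset \mathbb{R}^d$. The collection is
\emph{$(w,k)$-dispersed} if every ball of radius $w$ in $\mathcal{C}$
contains discontinuities of at most $k$ of the functions
$u_1, \dots, u_T$; that is, no small region of the parameter space
concentrates many of the functions' discontinuities.
\end{definition}
\end{verbatim}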
Year
2018
DOI
10.1109/FOCS.2018.00064
Venue
2018 IEEE 59th Annual Symposium on Foundations of Computer Science (FOCS)
Keywords
dispersion, algorithm design, online optimization, differentially private optimization, piecewise Lipschitz
Field
Approximation algorithm, Discrete mathematics, Mathematical optimization, Parameterized complexity, Algorithm design, Regret, Upper and lower bounds, Computer science, Greedy algorithm, Lipschitz continuity, Piecewise
DocType
Conference
ISSN
1523-8288
ISBN
978-1-5386-4231-3
Citations
2
PageRank
0.36
References
23
Authors
3
Name | Order | Citations | PageRank
Maria-Florina Balcan | 1 | 1445 | 105.01
Travis Dick | 2 | 23 | 5.94
Ellen Vitercik | 3 | 23 | 7.23