Title
The Limitations of Optimization from Samples
Abstract
In this paper we consider the following question: can we optimize objective functions from the training data we use to learn them? We formalize this question through a novel framework we call optimization from samples (OPS). In OPS, we are given samples of a function's values on sets drawn from some distribution, and the objective is to optimize the function under some constraint. While there are interesting classes of functions that can be optimized from samples, our main result is an impossibility. We show that there are classes of functions which are statistically learnable and optimizable, but for which no reasonable approximation for optimization from samples is achievable. In particular, our main result shows that there is no constant-factor approximation for maximizing coverage functions under a cardinality constraint using polynomially many samples drawn from any distribution. We also show tight approximation guarantees for maximizing several interesting classes of functions under a cardinality constraint, including unit-demand, additive, and general monotone submodular functions, as well as a constant-factor approximation for monotone submodular functions with bounded curvature.
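To make the OPS interface concrete, the sketch below shows the setting the abstract describes: an algorithm only sees pairs (S_i, f(S_i)) for sets S_i drawn from a distribution, and must output a set of size at most k. The random coverage function, the product sampling distribution, and the naive "best observed feasible sample" strategy are illustrative assumptions of this example, not the paper's constructions or algorithms.

# Illustrative OPS sketch (assumptions: random coverage function, product
# sampling distribution, naive best-sample strategy). Not the paper's method.
import itertools
import random


def make_coverage_function(universe_size=20, num_sets=10, seed=0):
    # f(S) = size of the union of the ground sets indexed by S (a coverage function).
    rng = random.Random(seed)
    ground_sets = [
        frozenset(rng.sample(range(universe_size), rng.randint(1, universe_size // 2)))
        for _ in range(num_sets)
    ]

    def f(S):
        covered = set()
        for i in S:
            covered |= ground_sets[i]
        return len(covered)

    return f, num_sets


def sample_ops_data(f, n, num_samples, p=0.3, seed=1):
    # The OPS input: pairs (S_i, f(S_i)) with each S_i drawn from a distribution D,
    # here the product distribution including each element independently w.p. p.
    rng = random.Random(seed)
    samples = []
    for _ in range(num_samples):
        S = frozenset(i for i in range(n) if rng.random() < p)
        samples.append((S, f(S)))
    return samples


def best_sample_baseline(samples, k):
    # Naive OPS strategy: output the highest-valued sampled set that already
    # satisfies the cardinality constraint |S| <= k.
    feasible = [(S, v) for S, v in samples if len(S) <= k]
    return max(feasible, key=lambda sv: sv[1]) if feasible else (frozenset(), 0)


def brute_force_optimum(f, n, k):
    # Exact optimum of max f(S) s.t. |S| <= k; only feasible on tiny instances.
    return max(f(frozenset(S)) for S in itertools.combinations(range(n), k))


if __name__ == "__main__":
    f, n = make_coverage_function()
    k = 3
    samples = sample_ops_data(f, n, num_samples=500)
    _, ops_value = best_sample_baseline(samples, k)
    print("best feasible sampled value:", ops_value)
    print("true optimum:", brute_force_optimum(f, n, k))

Running the sketch typically shows the sample-based baseline trailing the true optimum on such instances; this only illustrates the interface and the gap in question, and is not a proof of the paper's lower bound.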
Year
2015
DOI
10.1145/3055399.3055406
Venue
STOC
Keywords
Optimization, PAC learning, coverage functions
DocType
Journal
Volume
abs/1512.06238
ISSN
0737-8017
Citations
7
PageRank
0.44
References
43
Authors
3
Name              Order  Citations  PageRank
Eric Balkanski    1      38         6.13
Aviad Rubinstein  2      179        24.66
Yaron Singer      3      516        37.15