Abstract
---
What kinds of functions are learnable from their satisfying assignments? Motivated by this simple question, we extend the framework of [DDS15], which studied the learnability of probability distributions over {0,1}^n defined by the set of satisfying assignments to "low-complexity" Boolean functions, to Boolean-valued functions defined over continuous domains. In our learning scenario there is a known "background distribution" D over R^n (such as a known normal distribution or a known log-concave distribution) and the learner is given i.i.d. samples drawn from a target distribution D_f, where D_f is D restricted to the satisfying assignments of an unknown low-complexity Boolean-valued function f. The problem is to learn an approximation D' of the target distribution D_f which has small error as measured in total variation distance.
We give a range of efficient algorithms and hardness results for this problem, focusing on the case when f is a low-degree polynomial threshold function (PTF). When the background distribution D is log-concave, we show that this learning problem is efficiently solvable for degree-1 PTFs (i.e., linear threshold functions) but not for degree-2 PTFs. In contrast, when D is a normal distribution, we show that this learning problem is efficiently solvable for degree-2 PTFs but not for degree-4 PTFs. Our hardness results rely on standard assumptions about secure signature schemes.
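The sampling model described above can be illustrated with a short sketch. Here D is taken to be a standard normal over R^n and f is a simple degree-2 PTF; both the helper name `sample_Df` and the particular choice of f are hypothetical, chosen only to show how samples from D_f arise. (The paper's learner only ever sees such samples; rejection sampling is just one way to generate them when f is known.)

```python
# Sketch (not from the paper): generate i.i.d. samples from D_f, where D is a
# standard normal over R^n and f is a degree-2 polynomial threshold function.
import numpy as np

def sample_Df(ptf, n, num_samples, rng):
    """Rejection-sample points x ~ N(0, I_n), keeping only those with ptf(x) True.

    The kept points are i.i.d. draws from D_f, i.e., the background normal
    distribution conditioned on the satisfying assignments of the PTF.
    """
    samples = []
    while len(samples) < num_samples:
        x = rng.standard_normal(n)
        if ptf(x):
            samples.append(x)
    return np.array(samples)

# Hypothetical example target: the degree-2 PTF f(x) = sign(sum_i x_i^2 - n),
# whose satisfying assignments are the points outside a sphere of radius sqrt(n).
rng = np.random.default_rng(0)
n = 5
f = lambda x: np.dot(x, x) - n >= 0
pts = sample_Df(f, n, 100, rng)
```

Every row of `pts` satisfies f by construction, so `pts` is exactly the kind of sample the learner receives: draws from the background distribution restricted to f's satisfying region.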
Year | DOI | Venue
---|---|---
2020 | 10.5555/3381089.3381095 | SODA '20: ACM-SIAM Symposium on Discrete Algorithms, Salt Lake City, Utah, January 2020
Field | DocType | Citations
---|---|---
Discrete mathematics, Algebra, Computer science, Continuous distributions | Conference | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Clément L. Canonne | 1 | 103 | 16.16 |
Anindya De | 2 | 239 | 24.77 |
Rocco A. Servedio | 3 | 1656 | 133.28 |