Abstract |
---|
Using the recently developed framework of [Daniely et al., 2014], we show that, under a natural assumption on the complexity of refuting random K-SAT formulas, learning DNF formulas is hard. The same assumption also implies the hardness of learning intersections of $\omega(\log(n))$ halfspaces, of agnostically learning conjunctions, and of virtually all (distribution-free) learning problems that were previously shown hard under complexity assumptions. |
Attribute | Value
---|---
Year | 2014
Venue | Conference on Learning Theory
Fields | Discrete mathematics, Artificial intelligence, Machine learning, Mathematics
DocType | Journal
Volume | abs/1404.3378
Citations | 24
PageRank | 0.85
References | 26
Authors | 2
Name | Order | Citations | PageRank |
---|---|---|---
Amit Daniely | 1 | 216 | 20.92 |
Shai Shalev-Shwartz | 2 | 3681 | 276.32 |