Abstract |
---|
We extend the notion of general dimension, a combinatorial characterization of learning complexity for arbitrary query protocols, to encompass approximate learning. This immediately yields a characterization of the learning complexity in the statistical query model. As a further application, we consider approximate learning of DNF formulas and derive close upper and lower bounds on the number of statistical queries needed. In particular, we show that with respect to the uniform distribution, and for any constant error parameter ε, the number of statistical queries needed to learn DNF formulas (over n variables and with s terms) with tolerance τ = Θ(1/s) is n^Θ(log s). |
Year | DOI | Venue |
---|---|---|
2002 | 10.1007/3-540-36169-3_13 | ALT |
Keywords | Field | DocType
---|---|---
dnf formula,statistical query model,n variable,constant error parameter,learning boolean functions,general dimension,lower bound,statistical query,combinatorial characterization,arbitrary query protocol,approximate learning,uniform distribution,upper and lower bounds,boolean function | Boolean function,Discrete mathematics,Concept class,Upper and lower bounds,Uniform distribution (continuous),Mathematics,Learning complexity | Conference |
Volume | ISSN | ISBN
---|---|---
2533 | 0302-9743 | 3-540-00170-0 |
Citations | PageRank | References
---|---|---
3 | 0.42 | 15 |
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Johannes Köbler | 1 | 580 | 46.51 |
Wolfgang Lindner | 2 | 23 | 3.38 |