Title
Distribution Free Learning with Local Queries.
Abstract
The model of learning with \emph{local membership queries} interpolates between the PAC model and the membership queries model by allowing the learner to query the label of any example that is similar to an example in the training set. This model, recently proposed and studied by Awasthi, Feldman and Kanade, aims to facilitate practical use of membership queries. We continue this line of work, proving both positive and negative results in the \emph{distribution free} setting. We restrict to the boolean cube $\{-1, 1\}^n$, and say that a query is $q$-local if it is of Hamming distance $\le q$ from some training example. On the positive side, we show that $1$-local queries already give additional strength, and allow learning a certain type of DNF formulas. On the negative side, we show that even $\left(n^{0.99}\right)$-local queries cannot help to learn various classes including Automata, DNFs and more. Likewise, $q$-local queries for any constant $q$ cannot help to learn Juntas, Decision Trees, Sparse Polynomials and more. Moreover, for these classes, an algorithm that uses $\left(\log^{0.99}(n)\right)$-local queries would lead to a breakthrough in the best known running times.
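To make the locality condition concrete, here is a minimal Python sketch, purely illustrative and not from the paper; the helper names hamming_distance and is_q_local are ours. It checks whether a query point on the boolean cube $\{-1, 1\}^n$ is $q$-local, i.e. within Hamming distance $q$ of at least one training example.

def hamming_distance(x, y):
    # Number of coordinates on which two {-1, 1}-vectors disagree.
    return sum(xi != yi for xi, yi in zip(x, y))

def is_q_local(query, training_set, q):
    # A query is q-local if it lies within Hamming distance q of some training example.
    return any(hamming_distance(query, x) <= q for x in training_set)

# Example on the cube {-1, 1}^3 with two training points (hypothetical data).
S = [(-1, 1, 1), (1, 1, -1)]
print(is_q_local((1, 1, 1), S, q=1))     # True: distance 1 from each training point
print(is_q_local((-1, -1, -1), S, q=1))  # False: distance 2 from each training point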
Year
2016
Venue
arXiv: Learning
Field
Training set, Decision tree, Polynomial, Automaton, Hamming distance, Artificial intelligence, Machine learning, Mathematics, restrict, Cube
DocType
Volume
abs/1603.03714
Citations
0
Journal
PageRank
0.34
References
11
Authors
3
Name                  Order  Citations  PageRank
Galit Bary-Weisberg   1      0          0.34
Amit Daniely          2      2162       0.92
Shai Shalev-Shwartz   3      36812      76.32