Title
A Microchoice Bound for Continuous-Space Classification Algorithms
Abstract
Classifiers are often constructed iteratively by introducing changes sequentially to an initial classifier. Langford and Blum (COLT'99: Proceedings of the 12th Annual Conference on Computational Learning Theory, 1999, San Mateo, CA: Morgan Kaufmann, pp. 209–214) take advantage of this structure (the microchoice structure) to obtain bounds on the generalization ability of such algorithms. These bounds can be sharper than more general bounds. This paper extends the applicability of the microchoice approach to the more realistic case where the classifier space is continuous and the sequence of changes is not restricted to a pre-fixed finite set. Proving the microchoice bound in the continuous case relies on a conditioning technique that is often used in proving VC results. It is shown how this technique can be used to convert any learning algorithm over a continuous space into a family of algorithms over discrete spaces. The new continuous microchoice result is applied to obtain a bound on the generalization ability of the perceptron algorithm. The greedy nature of the perceptron algorithm, which generates new classifiers by introducing corrections based on misclassified points, is exploited to obtain a generalization bound with the asymptotic form O(1/\sqrt{n}), where n is the training set size.
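The iterative, mistake-driven structure the abstract describes can be illustrated with the classic perceptron update rule: each misclassified point triggers a correction to the current classifier, producing the sequence of intermediate classifiers that microchoice-style bounds exploit. The following is a minimal sketch of that standard algorithm, not the paper's construction; the data format and stopping rule are assumptions.

```python
def perceptron(data, max_epochs=100):
    """Classic perceptron: greedily corrects the current classifier on each
    misclassified point, producing a sequence of intermediate classifiers
    (the iterative structure that microchoice bounds exploit).

    data: list of (x, y) pairs, x a list of floats, y in {+1, -1}.
    Returns the final weight vector w and bias b.
    """
    d = len(data[0][0])
    w = [0.0] * d          # initial classifier: the zero vector
    b = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for x, y in data:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:
                # misclassified: nudge the hyperplane toward the point
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                mistakes += 1
        if mistakes == 0:  # a full error-free pass: converged
            break
    return w, b

# Toy linearly separable data (hypothetical example)
data = [([2.0, 1.0], 1), ([1.0, 3.0], 1),
        ([-1.0, -2.0], -1), ([-2.0, -1.0], -1)]
w, b = perceptron(data)
```

On linearly separable data the loop terminates once a full pass makes no mistakes; each update depends only on the single misclassified point, which is why the number of corrections, rather than the size of the classifier space, governs the bound.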
Year: 2003
DOI: 10.1023/A:1025615325644
Venue: Machine Learning
Keywords: microchoice, perceptron
Field: Training set, Finite set, Computer science, Algorithm, Artificial intelligence, Computational learning theory, Statistical classification, Classifier (linguistics), Perceptron, Machine learning
DocType: Journal
Volume: 53
Issue: 1-2
ISSN: 1573-0565
Citations: 1
PageRank: 0.45
References: 6
Authors: 1
Name: Yoram Gat
Order: 1
Citations: 15
PageRank: 2.41