Title
Learnability and the Vapnik-Chervonenkis dimension
Abstract
Valiant's learnability model is extended to learning classes of concepts defined by regions in Euclidean space E^n. The methods in this paper lead to a unified treatment of some of Valiant's results, along with previous results on distribution-free convergence of certain pattern recognition algorithms. It is shown that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned. Using this parameter, the complexity and closure properties of learnable classes are analyzed, and necessary and sufficient conditions for feasible learnability are provided.
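As an informal illustration of the combinatorial parameter the abstract refers to, the following Python sketch (not from the paper) brute-forces the Vapnik-Chervonenkis dimension of one simple concept class, closed intervals [a, b] on the real line, by checking which finite point sets can be shattered. The concept class, the function names (is_shattered, vc_dimension), and the brute-force search are illustrative assumptions, not the paper's construction.

# Illustrative sketch: VC dimension of intervals [a, b] on the real line.
from itertools import combinations

def interval_labels(points, a, b):
    # Label each point 1 if it lies in the closed interval [a, b], else 0.
    return tuple(1 if a <= x <= b else 0 for x in points)

def is_shattered(points):
    # The class of intervals shatters `points` iff it realizes all 2^|points| labelings.
    # Endpoints taken from the points themselves plus values just outside them
    # suffice to produce every labeling an interval can produce on these points.
    pts = sorted(points)
    endpoints = pts + [pts[0] - 1, pts[-1] + 1]
    achieved = {interval_labels(points, a, b) for a in endpoints for b in endpoints}
    return len(achieved) == 2 ** len(points)

def vc_dimension(domain, max_d):
    # Largest d <= max_d such that some d-point subset of `domain` is shattered.
    best = 0
    for d in range(1, max_d + 1):
        if any(is_shattered(list(s)) for s in combinations(domain, d)):
            best = d
    return best

if __name__ == "__main__":
    # Intervals shatter any 2 points but no 3 points (the labeling
    # "in, out, in" is impossible), so the VC dimension is 2.
    print(vc_dimension(domain=[0, 1, 2, 3, 4], max_d=4))  # prints 2

Finiteness of this parameter (here, 2) is what the paper identifies as the essential condition for distribution-free learnability of a concept class.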
Year
1989
DOI
10.1145/76359.76371
Venue
Journal of the ACM (JACM)
Keywords
PAC learning, Vapnik-Chervonenkis classes, Vapnik-Chervonenkis dimension, sample complexity, learnability theory, learnability model, distribution-free convergence, distribution-free learnability, Occam's razor, pattern recognition algorithms, Euclidean space, capacity, closure properties, combinatorial parameters, learning from examples, feasible learnability
DocType
Journal
Volume
36
Issue
4
ISSN
0004-5411
Citations
793
PageRank
307.26
References
46
Authors
4
Name                Order  Citations  PageRank
Anselm Blumer       1      1227       598.56
A. Ehrenfeucht      2      1823       497.83
David Haussler      3      8327       3068.93
Manfred K. Warmuth  4      6105       1975.48