Title: Quantifying the Value of Constructive Induction, Knowledge, and Noise Filtering on Inductive Learning
Abstract: Machine learning research, as one of its central goals, tries to measure, model, and understand how learning-problem properties affect average-case learning performance. For example, we would like to quantify the value of constructive induction, noise filtering, and background knowledge. This paper works toward this goal by combining psychology's mathematical learning theory with computational learning theory. The paper defines the effective dimension, a new learning measure that empirically links problem properties to learning performance. Like the Vapnik-Chervonenkis (VC) dimension, the effective dimension often stands in a simple linear relation to problem properties, and it can be used to make verifiable predictions about learning performance. Unlike the VC dimension, the effective dimension is estimated empirically, and its predictions are average-case. It is therefore more widely applicable to machine- and human-learning research. The measure has been applied to several learning systems, including Backpropagation. It has also been used to precisely predict the benefit of using FRINGE, a feature construction system. (The benefit was found to decrease as the complexity of the target concept increases.)
Year: 1991
DOI: 10.1016/B978-1-55860-200-7.50034-9
Venue: International Conference on Machine Learning
Keywords: vc dimension
Field: Algorithmic learning theory, Semi-supervised learning, Multi-task learning, Instance-based learning, Stability (learning theory), Active learning (machine learning), Computer science, Artificial intelligence, Computational learning theory, Machine learning, Sample exclusion dimension
DocType: Conference
Issue: 1
Citations: 1
PageRank: 0.36
References: 7
Authors: 1
Author details:
Name: Carl Myers Kadie
Order: 1
Citations: 1948
PageRank: 196.06