Title: Perceptron Based Learning with Example Dependent and Noisy Costs
Abstract: Learning algorithms from the fields of artificial neural networks and machine learning typically do not take any costs into account, or allow only costs that depend on the classes of the examples used for learning. As an extension of class-dependent costs, we consider costs that are example dependent, i.e. feature and class dependent. We derive a cost-sensitive perceptron learning rule for non-separable classes that can be extended to multi-modal classes (DIPOL). We also derive an approach for including example dependent costs into an arbitrary cost-insensitive learning algorithm by sampling according to modified probability distributions.
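The sampling idea mentioned at the end of the abstract can be illustrated with a short sketch. The snippet below shows cost-proportional rejection sampling, a common way to realize a cost-modified example distribution; the function name, the toy data, and the exact acceptance rule are assumptions for illustration, not details taken from the paper. A cost-insensitive learner trained on the resampled data then sees each example with frequency roughly proportional to its cost.

```python
import random

def cost_proportional_sample(examples, costs, seed=0):
    """Resample a training set so a cost-insensitive learner sees
    high-cost examples more often (illustrative sketch only).

    Each example is kept with probability cost / max_cost, which is
    the rejection-sampling way to draw from a distribution whose
    density is scaled by the example-dependent cost.
    """
    rng = random.Random(seed)
    c_max = max(costs)
    kept = []
    for x, c in zip(examples, costs):
        # Accept with probability proportional to the example's cost;
        # a maximum-cost example is always accepted (c / c_max == 1.0).
        if rng.random() < c / c_max:
            kept.append(x)
    return kept

# Toy data: ten examples, the last one ten times as costly as the rest.
data = list(range(10))
costs = [1.0] * 9 + [10.0]
sample = cost_proportional_sample(data, costs)
```

Low-cost examples are thinned out while the highest-cost example survives every draw, so an ordinary (cost-blind) learner trained on `sample` implicitly weights its errors by cost.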
Year: 2003
Venue: ICML
Keywords: machine learning, neural network, probability distribution
Field: Competitive learning, Instance-based learning, Stability (learning theory), Semi-supervised learning, Pattern recognition, Computer science, Unsupervised learning, Artificial intelligence, Computational learning theory, Ensemble learning, Machine learning, Learning classifier system
DocType: Conference
Citations: 10
PageRank: 0.99
References: 7
Authors: 2
Name            | Order | Citations | PageRank
Peter Geibel    | 1     | 286       | 26.62
Fritz Wysotzki  | 2     | 456       | 45.46