Title
Improved binary classification performance using an information theoretic criterion
Abstract
Feedforward neural networks trained to solve classification problems define an approximation of the conditional probabilities P(Ci|x) when the output units correspond to the categories Ci. The present paper shows that if a least mean squared error cost function is minimised during the training phase, the resulting approximation of P(Ci|x) is poor in the ranges of the input variable x where the conditional probabilities take on very low values. The use of the Kullback-Leibler distance measure is proposed to overcome this limitation: a cost function derived from this information theoretic measure is defined, and a computationally light training procedure is derived for the case of binary classification problems. The effectiveness of the proposed procedure is verified by means of comparative experiments.
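As a minimal numerical sketch (not taken from the paper; the helper names and sample values are illustrative assumptions), the following Python snippet compares the squared error with the Kullback-Leibler distance between a true conditional probability p = P(C1|x) and a network estimate q. Two estimates with identical squared error are distinguished sharply by the KL distance when p is close to zero, which is the regime where, per the abstract, the mean squared error criterion approximates poorly.

import numpy as np

def kl_bernoulli(p, q, eps=1e-12):
    # KL distance between the true Bernoulli distribution (p, 1-p) and the
    # network's estimate (q, 1-q); hypothetical helper, not the paper's code.
    p = np.clip(p, eps, 1 - eps)
    q = np.clip(q, eps, 1 - eps)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def squared_error(p, q):
    # Per-pattern least-mean-squares contribution for the same pair.
    return (q - p) ** 2

# Two (p, q) pairs with the same absolute error of 0.1: one in the middle of
# the probability range, one where the true conditional probability is tiny.
for p, q in [(0.5, 0.6), (0.001, 0.101)]:
    print(f"p={p:.3f}  q={q:.3f}  "
          f"squared error={squared_error(p, q):.4f}  "
          f"KL={kl_bernoulli(p, q):.4f}")

# The squared errors coincide (0.0100), but the KL distance is about five
# times larger in the low-probability regime, so a KL-derived cost penalises
# exactly the deviations that the least mean squared error criterion ignores.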
Year
1996
DOI
10.1016/0925-2312(96)00025-2
Venue
Neurocomputing
Keywords
Feedforward neural networks, Classification, Kullback-Leibler distance
Field
Feedforward neural network, Conditional probability, Binary classification, Pattern recognition, Mean squared error, Artificial intelligence, Machine learning, Kullback–Leibler divergence, Mathematics
DocType
Journal
Volume
13
Issue
2-4
ISSN
0925-2312
Citations
1
PageRank
0.41
References
9
Authors
2
Name                 Order   Citations   PageRank
Pietro Burrascano    1       21          4.77
Dario Pirollo        2       2           0.77