Title
Prior Probability Weights And Neural Network Learning
Abstract
A class of non-logarithmic likelihood ratios is considered and applied to the learning of neural networks, including hierarchical mixtures of experts. Such a likelihood ratio is based on an α-logarithm, which contains the usual logarithm as a special case. This generalized logarithm is defined through a discussion of the α-divergence, which includes the Kullback-Leibler number as a special case. It is found that using such a generalized logarithm on the likelihood ratio is equivalent to applying a prior probability weight. This prior weighting is then derived for the learning of neural networks of expert mixtures. Both gradient ascent maximization and EM learning are discussed. The prior weighting is understood as a speed-up and stabilization of the learning.
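The following is a minimal sketch of the quantities the abstract refers to, assuming a Matsuyama-style parameterization in which α = -1 recovers the ordinary logarithm (the paper's own convention may place the special case at a different α):

\[
L^{(\alpha)}(x) = \frac{2}{1+\alpha}\left(x^{(1+\alpha)/2} - 1\right) \quad (\alpha \neq -1),
\qquad
\lim_{\alpha \to -1} L^{(\alpha)}(x) = \log x,
\]
\[
D_{\alpha}(p \,\|\, q) = \frac{4}{1-\alpha^{2}}
\left(1 - \int p(x)^{(1-\alpha)/2}\, q(x)^{(1+\alpha)/2}\, dx\right),
\]

where, under this convention, D_α(p ‖ q) tends to the Kullback-Leibler number KL(p ‖ q) as α → -1. Differentiating the α-logarithm of a likelihood ratio r(θ) gives

\[
\frac{\partial}{\partial \theta}\, L^{(\alpha)}\!\bigl(r(\theta)\bigr)
= r(\theta)^{(1+\alpha)/2}\, \frac{\partial}{\partial \theta} \log r(\theta),
\]

which illustrates the equivalence claimed in the abstract: the α-logarithm multiplies the ordinary log-likelihood gradient by the weight r(θ)^{(1+α)/2}, which plays the role of the prior probability weight; at α = -1 the weight is identically 1 and plain log-likelihood learning is recovered.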
Year
1997
Venue
PROGRESS IN CONNECTIONIST-BASED INFORMATION SYSTEMS, VOLS 1 AND 2
Field
Neural network learning, Computer science, Artificial intelligence, Prior probability, Machine learning
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name               Order  Citations  PageRank
Yasuo Matsuyama    1      60         16.41
S. Furukawa        2      0          0.34
T. Ikeda           3      0          0.34