Title
Learning Conditional Linear Gaussian Classifiers with Probabilistic Class Labels.
Abstract
We study the problem of learning Bayesian classifiers (BCs) when the true class labels of the training instances are not known and are instead replaced by a probability distribution over the class labels for each instance. This scenario can arise, e.g., when a group of experts is asked to individually provide a class label for each instance. We particularize the generalized expectation maximization (GEM) algorithm in [1] to learn BCs with different structural complexities: naive Bayes, averaged one-dependence estimators, and general conditional linear Gaussian classifiers. An evaluation conducted on eight datasets shows that BCs learned with GEM perform better than those learned with either the classical expectation maximization (EM) algorithm or potentially wrong class labels. The BCs achieve results similar to those of the multivariate Gaussian classifier without having to estimate the full covariance matrices.
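As a rough illustration of the setting described in the abstract (and not the GEM procedure of the paper itself), the sketch below shows one way probabilistic class labels can act as soft weights when estimating the parameters of a Gaussian naive Bayes classifier. The function names (fit_gaussian_nb_soft, predict_proba), the array layout, and the small variance floor are assumptions introduced only for this example.

import numpy as np

def fit_gaussian_nb_soft(X, P):
    """Fit a Gaussian naive Bayes model from probabilistic class labels.

    X : (n, d) array of continuous features.
    P : (n, k) array; P[i, c] is the probability that instance i belongs
        to class c (each row sums to 1).
    Returns class priors, per-class feature means and variances.
    """
    n, d = X.shape
    k = P.shape[1]
    # Effective (soft) count of instances per class.
    nk = P.sum(axis=0)                      # shape (k,)
    priors = nk / n
    # Weighted means: each instance contributes to every class
    # in proportion to its label probability.
    means = (P.T @ X) / nk[:, None]         # shape (k, d)
    # Weighted variances around the class means (small floor added
    # here as an assumption, to avoid zero variance).
    variances = np.empty((k, d))
    for c in range(k):
        diff = X - means[c]
        variances[c] = (P[:, c, None] * diff**2).sum(axis=0) / nk[c] + 1e-9
    return priors, means, variances

def predict_proba(X, priors, means, variances):
    """Posterior class probabilities under the fitted naive Bayes model."""
    # Log-density of each instance under each class-conditional Gaussian,
    # summed over features (the naive Bayes independence assumption).
    log_lik = -0.5 * (np.log(2 * np.pi * variances)[None, :, :]
                      + (X[:, None, :] - means[None, :, :]) ** 2
                      / variances[None, :, :]).sum(axis=2)
    log_post = np.log(priors)[None, :] + log_lik
    log_post -= log_post.max(axis=1, keepdims=True)
    post = np.exp(log_post)
    return post / post.sum(axis=1, keepdims=True)

For instance, if expert-provided label distributions are stored row-wise in P, a single weighted maximum-likelihood pass like fit_gaussian_nb_soft(X, P) yields the parameters consumed by predict_proba; the paper's GEM approach and its extension to AODE and conditional linear Gaussian structures go beyond this simple sketch.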
Year
2013
DOI
10.1007/978-3-642-40643-0_15
Venue
ADVANCES IN ARTIFICIAL INTELLIGENCE, CAEPIA 2013
Keywords
Bayesian classifiers, probabilistic class labels, partially supervised learning, belief functions
Field
Naive Bayes classifier, Pattern recognition, Expectation–maximization algorithm, Random subspace method, Computer science, Multivariate normal distribution, Probability distribution, Artificial intelligence, Probabilistic logic, Classifier (linguistics), Machine learning, Bayesian probability
DocType
Conference
Volume
8109
ISSN
0302-9743
Citations
1
PageRank
0.34
References
7
Authors
3
Name                    Order   Citations   PageRank
Pedro L. López-Cruz     1       33          3.88
Concha Bielza           2       909         72.11
Pedro Larrañaga         3       3882        208.54