Abstract | ||
---|---|---|
Parametric classifiers trained with the Bayes rule are usually more accurate than non-parametric classifiers such as nearest neighbors, neural networks, and support vector machines when the class-conditional densities are known up to some parameters and training data are abundant. However, parametric classifiers perform poorly when these class-conditional densities are unknown and the assumed distribution models are inaccurate. In this paper, we propose a hybrid classification method for data with partially known distribution models, i.e., where the distribution models of only some classes are known. In this partial-models case, the proposed hybrid classifier makes the best use of the known distribution models through Bayesian inference, whereas both purely parametric and purely non-parametric classifiers lose predictive capacity. Theoretical proofs and experimental results show that the proposed hybrid classifier performs much better than purely parametric and non-parametric classifiers on data with partial models. |
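The abstract's idea of mixing a parametric density (for classes whose model is known) with a non-parametric estimate (for the rest) inside one Bayes-rule decision can be sketched as follows. This is only an illustrative construction, not the paper's actual algorithm: the Gaussian model for class 0, the kernel-density estimate for class 1, the bandwidth, and the equal priors are all assumptions made here for the example.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.neighbors import KernelDensity

# Hypothetical two-class setup: class 0 has a *known* Gaussian model,
# class 1's density is unknown and must be estimated from training data.
rng = np.random.default_rng(0)
X1_train = rng.normal(loc=3.0, scale=1.0, size=(200, 2))  # class 1 samples

# Parametric density for class 0: the known distribution model.
p0 = multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2))

# Non-parametric density for class 1: a KDE fit to its training data
# (bandwidth chosen ad hoc for the sketch).
kde1 = KernelDensity(bandwidth=0.5).fit(X1_train)

def hybrid_predict(X, prior0=0.5):
    """Bayes-rule decision mixing a parametric and a non-parametric density."""
    log_post0 = np.log(prior0) + p0.logpdf(X)          # known model
    log_post1 = np.log(1.0 - prior0) + kde1.score_samples(X)  # estimated model
    return (log_post1 > log_post0).astype(int)

# Points near each class mean should be assigned to that class.
print(hybrid_predict(np.array([[0.0, 0.0], [3.0, 3.0]])))  # → [0 1]
```

A purely parametric classifier would have to guess a (possibly wrong) model for class 1, while a purely non-parametric one would discard the exact density known for class 0; the hybrid keeps both sources of information.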
Year | DOI | Venue |
---|---|---|
2014 | 10.1109/IJCNN.2014.6889782 | IJCNN |
Keywords | Field | DocType
---|---|---|
predictive capacity,partially known distribution models,bayes methods,class-conditional densities,pattern classification,hybrid classifier,known distribution models,bayesian rule,parametric classifier training,nonparametric classifier,hybrid classification method,bayesian inference,predictive models,gaussian distribution,neural networks,support vector machines,training data,data models | Variable-order Bayesian network,Pattern recognition,Naive Bayes classifier,Random subspace method,Computer science,Support vector machine,Parametric statistics,Artificial intelligence,Linear classifier,Probabilistic classification,Machine learning,Bayes classifier | Conference
ISSN | Citations | PageRank
---|---|---|
2161-4393 | 2 | 0.38
References | Authors | |
---|---|---|
0 | 4 | |