Title: Multiclass Probabilistic Classification Vector Machine
Abstract: The probabilistic classification vector machine (PCVM) synthesizes the advantages of both the support vector machine and the relevance vector machine, delivering a sparse Bayesian solution to classification problems. However, the PCVM is currently only applicable to binary cases. Extending the PCVM to multiclass cases via heuristic voting strategies such as one-vs-rest or one-vs-one often results in a dilemma where classifiers make contradictory predictions, and those strategies may lose the benefits of probabilistic outputs. To overcome this problem, we extend the PCVM and propose a multiclass PCVM (mPCVM). Two learning algorithms, a top-down algorithm and a bottom-up algorithm, are implemented in the mPCVM. The top-down algorithm obtains the maximum a posteriori (MAP) point estimates of the parameters based on an expectation-maximization algorithm, and the bottom-up algorithm is an incremental paradigm that maximizes the marginal likelihood. The superior performance of the mPCVMs, especially when the investigated problem has a large number of classes, is extensively evaluated on synthetic and benchmark data sets.
Year: 2020
DOI: 10.1109/TNNLS.2019.2947309
Venue: IEEE Transactions on Neural Networks and Learning Systems
Keywords: Support vector machines, Probabilistic logic, Training, Bayes methods, Prediction algorithms, Learning systems, Acceleration
DocType: Journal
Volume: 31
Issue: 10
ISSN: 2162-237X
Citations: 0
PageRank: 0.34
References: 8
Authors: 5
Name | Order | Citations | PageRank
Shengfei Lyu | 1 | 0 | 1.35
Xing Tian | 2 | 16 | 3.27
Yang Li | 3 | 2 | 1.03
Bingbing Jiang | 4 | 21 | 8.50
Chen H. | 5 | 5164 | 5.40