Abstract |
---|
The probabilistic classification vector machine (PCVM) synthesizes the advantages of both the support vector machine and the relevance vector machine, delivering a sparse Bayesian solution to classification problems. However, the PCVM is currently only applicable to binary cases. Extending the PCVM to multiclass cases via heuristic voting strategies such as one-vs-rest or one-vs-one often results in a dilemma in which classifiers make contradictory predictions, and those strategies may lose the benefits of probabilistic outputs. To overcome this problem, we extend the PCVM and propose a multiclass PCVM (mPCVM). Two learning algorithms, a top-down algorithm and a bottom-up algorithm, are implemented in the mPCVM. The top-down algorithm obtains the maximum *a posteriori* (MAP) point estimates of the parameters based on an expectation–maximization algorithm, and the bottom-up algorithm is an incremental paradigm that maximizes the marginal likelihood. The superior performance of the mPCVMs, especially when the investigated problem has a large number of classes, is extensively evaluated on synthetic and benchmark data sets. |
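The voting dilemma mentioned in the abstract can be illustrated with a minimal sketch (hypothetical votes, not from the paper): under one-vs-one decomposition, pairwise classifiers can vote cyclically, leaving every class with the same tally and no principled winner.

```python
from collections import Counter

# Hypothetical one-vs-one votes for a single test sample over classes {0, 1, 2}.
# Each pairwise classifier votes for one of its two classes; a cyclic outcome
# (0 beats 1, 1 beats 2, 2 beats 0) gives every class exactly one vote.
pairwise_votes = {(0, 1): 0, (1, 2): 1, (0, 2): 2}

tally = Counter(pairwise_votes.values())
print(tally)  # three-way tie: no single winning class, and no probabilities
```

A tie like this must be broken arbitrarily, and the hard votes carry no calibrated class probabilities, which is the shortcoming the mPCVM is designed to avoid.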
Year | DOI | Venue |
---|---|---|
2020 | 10.1109/TNNLS.2019.2947309 | IEEE Transactions on Neural Networks and Learning Systems |

Keywords | DocType | Volume |
---|---|---|
Support vector machines, Probabilistic logic, Training, Bayes methods, Prediction algorithms, Learning systems, Acceleration | Journal | 31 |

Issue | ISSN | Citations |
---|---|---|
10 | 2162-237X | 0 |

PageRank | References | Authors |
---|---|---|
0.34 | 8 | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Shengfei Lyu | 1 | 0 | 1.35 |
Xing Tian | 2 | 16 | 3.27 |
Yang Li | 3 | 2 | 1.03 |
Bingbing Jiang | 4 | 21 | 8.50 |
Chen H. | 5 | 516 | 45.40 |