Title
Investigation on the construction of the Relevance Vector Machine based on cross entropy minimization
Abstract
As a machine learning method within the sparse Bayesian framework, the classical Relevance Vector Machine (RVM) applies kernel methods to construct Radial Basis Function (RBF) networks using the fewest possible relevant basis functions. Compared with the well-known Support Vector Machine (SVM), the RVM provides better sparsity and automatic estimation of its hyperparameters. However, the performance of the original RVM depends entirely on the smoothness of the presumed prior over the connection weights and parameters; consequently, the sparsity is in practice still controlled by the choice of kernel functions or kernel parameters, which may lead to severe underfitting or overfitting in some cases. In the research presented in this paper, we explicitly incorporate the number of basis functions into the objective of the optimization procedure and construct the RVM by minimizing the cross entropy between the “hypothetical” probability distribution in the forward training pathway and the “true” probability distribution in the backward testing pathway. The experimental results show that the proposed method achieves both minimal structural complexity and an appropriate fit to the data.
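For context, the sketch below illustrates only the classical RVM baseline that the abstract refers to: type-II maximum-likelihood updates of the per-weight prior precisions, with basis functions pruned as their precisions diverge. It is a minimal sketch, not the paper's cross-entropy objective; the function names, RBF width, pruning threshold, and iteration count are illustrative assumptions.

```python
# Minimal sketch of classical RVM regression (sparse Bayesian learning).
# Illustrative baseline only; it does NOT implement the paper's
# cross-entropy criterion, and all names/constants are assumptions.
import numpy as np

def rbf_design_matrix(X, centers, width):
    """RBF kernel basis functions evaluated at X (illustrative kernel choice)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def rvm_regression(Phi, t, n_iter=200, prune_thresh=1e6):
    """Re-estimate weight-prior precisions alpha and noise precision beta;
    basis functions whose alpha diverges are pruned (sparsity)."""
    N, M = Phi.shape
    keep = np.arange(M)              # indices of surviving basis functions
    alpha = np.ones(M)               # precision of each weight's Gaussian prior
    beta = 1.0 / np.var(t)           # noise precision
    for _ in range(n_iter):
        P = Phi[:, keep]
        A = np.diag(alpha[keep])
        Sigma = np.linalg.inv(beta * P.T @ P + A)    # posterior covariance
        mu = beta * Sigma @ P.T @ t                  # posterior mean of weights
        gamma = 1.0 - alpha[keep] * np.diag(Sigma)   # "well-determinedness" factors
        alpha[keep] = gamma / (mu ** 2 + 1e-12)      # re-estimate prior precisions
        beta = (N - gamma.sum()) / (np.sum((t - P @ mu) ** 2) + 1e-12)
        keep = keep[alpha[keep] < prune_thresh]      # prune irrelevant bases
    P = Phi[:, keep]
    Sigma = np.linalg.inv(beta * P.T @ P + np.diag(alpha[keep]))
    mu = beta * Sigma @ P.T @ t
    return keep, mu, beta

# Example usage on synthetic 1-D data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(80, 1))
t = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(80)
Phi = rbf_design_matrix(X, X, width=1.0)    # one candidate basis per sample
relevant, weights, noise_prec = rvm_regression(Phi, t)
print(len(relevant), "relevance vectors retained out of", len(X))
```

The indices in `keep` correspond to the retained relevance vectors. In this baseline, sparsity emerges only implicitly from the evidence maximization and the chosen kernel; the paper's contribution, as stated in the abstract, is to make the number of basis functions an explicit term in a cross-entropy objective instead.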
Year
2016
DOI
10.1109/IConAC.2016.7604924
Venue
2016 22nd International Conference on Automation and Computing (ICAC)
Keywords
Relevance Vector Machine (RVM), Bayesian inference, Cross Entropy Minimization, Radial Basis Function (RBF) Network
Field
Radial basis function kernel, Pattern recognition, Support vector machine, Probability distribution, Artificial intelligence, Basis function, Overfitting, Relevance vector machine, Kernel method, Mathematics, Kernel (statistics)
DocType
Conference
ISBN
978-1-5090-2877-1
Citations
0
PageRank
0.34
References
4
Authors
4
Name            Order  Citations  PageRank
Xiaofang Liu    1      0          0.68
Ruikang Li      2      0          0.34
Dansong Cheng   3      28         6.42
Kai Cheng       4      39         12.36