Title
Generalized Variant Support Vector Machine
Abstract
With the advancement of information technology, datasets with enormous amounts of data have become available. Classification on such datasets grows more time- and memory-consuming as the amount of data increases. The support vector machine (SVM), arguably the most popular classification technique, performs disappointingly on large datasets due to its constrained optimization problem. To address this challenge, the variant SVM (VSVM) has been utilized, which adds the term (1/2)b² to its primal objective function, where b is the bias of the desired hyperplane. The VSVM has been solved with different optimization techniques in a more time- and memory-efficient fashion. However, there is no guarantee that its optimal solution coincides with that of the standard SVM. In this paper, we introduce the generalized VSVM (GVSVM), which has the term (1/2t)b² in its primal objective function for a fixed positive scalar t. We then present thorough theoretical insights showing that the optimal solution of the GVSVM tends to the optimal solution of the standard SVM as t → ∞. One vital corollary is a closed-form formula for the bias term of the standard SVM. Such a formula obviates the need to approximate it, which has been the modus operandi to date. An efficient neural network is then proposed to solve the GVSVM dual problem; it is asymptotically stable in the sense of Lyapunov and converges globally and exponentially to the exact solution of the GVSVM. The proposed neural network has a less complex architecture and needs fewer computations per iteration than existing neural solutions. Experiments confirm the efficacy of the proposed recurrent neural network and the proximity of the GVSVM and standard SVM solutions for larger values of t.
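The modification the abstract describes is a single extra bias-penalty term in the primal objective. A hedged sketch of the two formulations, assuming the usual soft-margin hinge-loss constraints (the constraint set is the standard textbook form, not quoted from the paper):

```latex
% Standard soft-margin SVM primal (textbook form, for reference):
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^{2} + C\sum_{i=1}^{n}\xi_i
\quad \text{s.t.}\quad y_i\bigl(w^{\top}x_i + b\bigr) \ge 1 - \xi_i,\qquad \xi_i \ge 0.

% GVSVM primal as described in the abstract: the added term
% \frac{1}{2t}b^{2} penalizes the bias, with t > 0 a fixed scalar.
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^{2} + \frac{1}{2t}b^{2} + C\sum_{i=1}^{n}\xi_i
\quad \text{s.t.}\quad y_i\bigl(w^{\top}x_i + b\bigr) \ge 1 - \xi_i,\qquad \xi_i \ge 0.
```

Setting t = 1 recovers the VSVM term (1/2)b², and as t → ∞ the penalty vanishes, which is consistent with the abstract's claim that the GVSVM solution tends to the standard SVM solution in that limit.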
Year: 2021
DOI: 10.1109/TSMC.2019.2917019
Venue: IEEE Transactions on Systems, Man, and Cybernetics: Systems
Keywords: Convex programming, exponential convergence, generalized VSVM (GVSVM), recurrent neural network (RNN), support vector machine (SVM)
DocType: Journal
Volume: 51
Issue: 5
ISSN: 2168-2216
Citations: 0
PageRank: 0.34
References: 18
Authors: 3
Name              Order  Citations  PageRank
Majid Mohammadi   1      7          4.49
S. Hamid Mousavi  2      0          0.34
Effati Sohrab     3      276        30.31