Title
Valley-loss regular simplex support vector machine for robust multiclass classification
Abstract
Handling noise and outliers is an important issue for support vector machines (SVMs). Although the pinball-loss SVM (Pin-SVM) and the ramp-loss SVM (Ramp-SVM) can deal with feature noise and outlier labels respectively, neither handles both, and extending them from binary to multiclass classification usually requires partitioning strategies. Since the regular simplex support vector machine (RSSVM) has been proposed as a novel all-in-one K-class classification model with clear advantages over partitioning strategies, it is promising to develop a loss function that is simultaneously robust to feature noise and insensitive to outlier labels and to embed it into the RSSVM framework. This paper presents the valley-loss regular simplex support vector machine (V-RSSVM) for robust multiclass classification. Inheriting the merits of both the pinball-type and ramp-type losses, the valley loss enjoys not only robustness to feature noise and outlier labels but also excellent sparseness. To train V-RSSVM fast, a concave-convex procedure (CCCP)-assisted sequential minimal optimization (SMO)-type solver and an initial-solution strategy for speed-up are developed. We also investigate the robustness, generalization error bound, and sparseness of V-RSSVM in theory. Numerical results on twenty-five real-life data sets verify the effectiveness of the proposed V-RSSVM model.
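The paper defines the valley loss precisely; as a rough illustration of the idea it combines (a pinball-type loss for two-sided margin penalties plus a ramp-style ceiling for bounded outlier influence), one might sketch it as below. The function names, the parameters `tau` and `s`, and the exact clipped form are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def pinball_loss(u, tau=0.5):
    """Pinball-type loss on the margin u = y * f(x).

    Penalizes points on both sides of the margin (slope -1 for
    u < 1, slope tau for u > 1), which is what gives Pin-SVM its
    robustness to feature noise.
    """
    return np.maximum(1.0 - u, -tau * (1.0 - u))

def valley_loss(u, tau=0.5, s=2.0):
    """Illustrative 'valley'-style loss (a sketch, not the paper's
    exact definition): the pinball loss clipped at a ceiling s,
    ramp-style, so that points grossly violating the margin (e.g.
    outlier labels) contribute at most s to the objective.
    """
    return np.minimum(pinball_loss(u, tau), s)

# Bounded at s for the gross violation at u = -10, two-sided and
# zero-at-margin near u = 1:
margins = np.array([-10.0, 0.5, 1.0, 3.0])
print(valley_loss(margins))
```

The clipping is what bounds the influence of mislabeled points, while the pinball shape retains feature-noise robustness near the margin.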
Year: 2021
DOI: 10.1016/j.knosys.2021.106801
Venue: Knowledge-Based Systems
Keywords: Feature noise and outlier labels, Robust K-class classifier, Sparseness, Valley-loss function, Regular simplex support vector machine
DocType: Journal
Volume: 216
ISSN: 0950-7051
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name                Order  Citations  PageRank
Long Tang           1      0          0.68
Ying-Jie Tian       2      18         6.34
Wenjun Li           3      1092       0.49
Panos M. Pardalos   4      6989       8.99