Title
Distributed One-Class Support Vector Machine
Abstract
This paper presents a novel distributed one-class classification approach based on an extension of the nu-SVM method, thus permitting its application to Big Data sets. Our method considers several one-class classifiers, each one determined using a given local data partition on a processor, with the goal of finding a global model. The cornerstone of this method is a novel mathematical formulation that makes the optimization problem separable while preventing data points considered outliers from influencing the final solution. This is particularly important because the decision region generated by the method is unaffected by the position of the outliers and therefore fits the shape of the data more precisely. Another interesting property is that, although built in parallel, the classifiers exchange data during learning in order to improve their individual specialization. Experimental results using different datasets demonstrate the good accuracy of the decision regions of the proposed method in comparison with other well-known classifiers, while saving training time due to its distributed nature.
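As a rough illustration of the partition-and-combine idea summarized in the abstract (and not the authors' separable nu-SVM formulation, which additionally exchanges points between local classifiers during learning), the following Python sketch trains independent one-class nu-SVMs on local data partitions and merges their decisions by a simple majority vote; the function names, partition scheme, and parameters are illustrative assumptions.

# Hypothetical sketch: independent nu-one-class SVMs on data partitions,
# combined by majority vote. This is NOT the paper's method; it only
# conveys the general "local models -> global decision" idea.
import numpy as np
from sklearn.svm import OneClassSVM

def train_local_models(X, n_partitions=4, nu=0.1, gamma="scale"):
    """Split the training data and fit one nu-one-class SVM per partition."""
    partitions = np.array_split(X, n_partitions)
    return [OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(part)
            for part in partitions]

def global_predict(models, X):
    """Majority vote over the local predictions (+1 inlier, -1 outlier)."""
    votes = np.stack([m.predict(X) for m in models])  # shape: (n_models, n_samples)
    return np.where(votes.sum(axis=0) >= 0, 1, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(400, 2))                      # nominal (one-class) data
    X_test = np.vstack([rng.normal(size=(10, 2)),            # nominal samples
                        rng.uniform(-6, 6, size=(10, 2))])   # likely outliers
    models = train_local_models(X_train)
    print(global_predict(models, X_test))

In a truly distributed setting each local model would be fit on the partition residing on its own processor, and only the fitted models (or their decisions) would need to be shared to form the global prediction.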
Year
2015
DOI
10.1142/S012906571550029X
Venue
INTERNATIONAL JOURNAL OF NEURAL SYSTEMS
Keywords
Support vector machines, one-class classification, distributed learning, outlier detection
Field
Anomaly detection, Data mining, One-class classification, Computer science, Random subspace method, Artificial intelligence, Optimization problem, Data point, Pattern recognition, Support vector machine, Outlier, Big data, Machine learning
DocType
Journal
Volume
25
Issue
7
ISSN
0129-0657
Citations
24
PageRank
0.99
References
23
Authors
4
Name                        Order   Citations   PageRank
Enrique Castillo            1       555         59.86
Diego Peteiro-Barral        2       70          9.07
Bertha Guijarro-Berdiñas    3       296         34.36
Oscar Fontenla-Romero       4       337         39.49