Title
A Kernel Support Vector Machine Trained Using Approximate Global and Exhaustive Local Sampling.
Abstract
AGEL-SVM is an extension of the kernel Support Vector Machine (SVM) designed for distributed computing using Approximate Global Exhaustive Local (AGEL) sampling. The dual form of the SVM is typically solved using sequential minimal optimization (SMO), which iterates very quickly when the full kernel matrix fits in a computer's memory. AGEL-SVM partitions the feature space into subproblems such that the kernel matrix for each subproblem fits in memory, approximating the data outside each partition. AGEL-SVM achieves Cohen's Kappa and accuracy metrics similar to those of the underlying SMO implementation. AGEL-SVM's training times decreased greatly when running on a 128-worker MATLAB pool on Amazon's EC2. Predictor evaluation times are also faster due to a reduction in support vectors per partition.
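The abstract describes partitioning the feature space, keeping all local data per partition while approximating the rest of the data, and training one kernel SVM per partition. The following Python sketch illustrates that general idea under stated assumptions; the k-means partitioning, random global subsampling, the RBF kernel, and the helper names train_agel_like/predict_agel_like are hypothetical illustrations, not the paper's exact scheme.

# Hedged sketch of an AGEL-style partitioned kernel SVM.
# Assumptions: k-means defines the partitions, and the "approximate global"
# data is a small random subsample of the points outside each partition.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def train_agel_like(X, y, n_partitions=4, global_frac=0.05, seed=0):
    rng = np.random.default_rng(seed)
    km = KMeans(n_clusters=n_partitions, n_init=10, random_state=seed).fit(X)
    models = []
    for p in range(n_partitions):
        local = km.labels_ == p
        outside = np.flatnonzero(~local)
        # Exhaustive local data plus a small approximation of the global data,
        # so each subproblem's kernel matrix stays small enough for memory.
        n_sample = max(1, int(global_frac * outside.size))
        sampled = rng.choice(outside, size=n_sample, replace=False)
        idx = np.concatenate([np.flatnonzero(local), sampled])
        models.append(SVC(kernel="rbf").fit(X[idx], y[idx]))
    return km, models

def predict_agel_like(km, models, X):
    # Route each query point to its nearest partition and use that
    # partition's SVM, which holds fewer support vectors than a global model.
    parts = km.predict(X)
    out = np.empty(len(X), dtype=models[0].classes_.dtype)
    for p, model in enumerate(models):
        mask = parts == p
        if mask.any():
            out[mask] = model.predict(X[mask])
    return out

In this sketch, prediction cost drops because each query is evaluated against only one partition's support vectors, which mirrors the speedup the abstract attributes to fewer support vectors per partition.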
Year
2017
Venue
BDCAT
Field
Kernel (linear algebra), Data mining, Feature vector, MATLAB, Computer science, Support vector machine, Algorithm, Sampling (statistics), Sequential minimal optimization, Partition (number theory), Iterated function
DocType
Conference
Citations
0
PageRank
0.34
References
3
Authors
4
Name | Order | Citations | PageRank
Benjamin Bryant | 1 | 0 | 0.34
Hamed Sari-Sarraf | 2 | 109 | 18.47
L. Rodney Long | 3 | 534 | 56.98
Sameer Antani | 4 | 1402 | 134.03