Abstract |
---|
Support vector machine (SVM) is a widely used tool for classification problems. SVM solves a quadratic optimization problem to decide which instances of the training dataset are support vectors, i.e., the informative instances needed to form the classifier. The support vectors are intact tuples taken from the training dataset, so releasing the SVM classifier for public use, or shipping it to clients, discloses the private content of the support vectors and violates privacy-preservation requirements arising for legal or commercial reasons. To the best of our knowledge, no prior work has extended the notion of privacy preservation to releasing the SVM classifier. In this paper, we propose an approximation approach that post-processes the SVM classifier to protect the private content of the support vectors. The approach is designed for the commonly used Gaussian radial basis function kernel. After this post-processing, the resulting privacy-preserving SVM classifier can be publicly released without exposing the private content of the support vectors, while providing classification accuracy comparable to the original SVM classifier. |
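The privacy leak the abstract describes is concrete: an RBF-kernel SVM's decision function is a weighted sum of kernel evaluations against the support vectors, so the released model must carry those support vectors, which are verbatim rows of the (possibly private) training set. A minimal sketch, using scikit-learn purely for illustration (not the paper's method):

```python
# Sketch: a trained RBF-kernel SVM stores exact training instances as
# support vectors, which is the disclosure problem the paper addresses.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy "private" training data: two 2-D Gaussian clusters.
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)),
               rng.normal(1.0, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="rbf", gamma=1.0).fit(X, y)

# Every support vector is an exact row copied from the training set,
# so releasing clf releases those private instances verbatim.
exposed = clf.support_vectors_
assert all(any(np.allclose(sv, x) for x in X) for sv in exposed)
print(f"{len(exposed)} training instances exposed verbatim in the classifier")
```

The paper's post-processor replaces this dependence on intact training tuples with an approximation of the decision function, so the released classifier no longer contains the original instances.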
Year | DOI | Venue |
---|---|---|
2008 | 10.1109/ICDM.2008.19 | ICDM |
Keywords | Field | DocType
---|---|---|
svm classifier, support vector machine, privacy-preservation requirement, comparable classification accuracy, support vector, private content, training dataset, original svm classifier, classification problem, approximation approach, quadratic optimization, testing, kernel, training data, vectors, support vector machines, data mining, radial basis function | Structured support vector machine, Data mining, Ranking SVM, Computer science, Artificial intelligence, Classifier (linguistics), Kernel (linear algebra), Pattern recognition, Tuple, Support vector machine, Margin classifier, Machine learning, Quadratic classifier | Conference
ISSN | Citations | PageRank
---|---|---|
1550-4786 | 7 | 0.50
References | Authors
---|---|
5 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Keng-Pei Lin | 1 | 117 | 11.61 |
Ming Chen | 2 | 6507 | 1277.71 |