Abstract |
---|
One of the well-known risks of large margin training methods, such as boosting and support vector machines (SVMs), is their sensitivity to outliers. These risks are normally mitigated by using a soft margin criterion, such as hinge loss, to reduce outlier sensitivity. In this paper, we present a more direct approach that explicitly incorporates outlier suppression in the training process. In particular, we show how outlier detection can be encoded in the large margin training principle of support vector machines. By expressing a convex relaxation of the joint training problem as a semidefinite program, one can use this approach to robustly train a support vector machine while suppressing outliers. We demonstrate that our approach can yield superior results to the standard soft margin approach in the presence of outliers. |
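The paper's actual method solves a semidefinite programming relaxation of the joint SVM/outlier-indicator problem. As a rough illustration of the joint-training idea only (not the paper's formulation), the sketch below alternates between fitting a soft-margin (hinge-loss) SVM by subgradient descent and masking out the points with the largest hinge loss; all function names and parameters here are our own.

```python
import numpy as np

def hinge_svm(X, y, lam=0.1, lr=0.05, epochs=300, mask=None):
    """Soft-margin linear SVM: subgradient descent on the regularized
    hinge loss, restricted to the points selected by `mask`."""
    n, d = X.shape
    if mask is None:
        mask = np.ones(n, dtype=bool)
    w, b = np.zeros(d), 0.0
    Xm, ym = X[mask], y[mask]
    for _ in range(epochs):
        viol = ym * (Xm @ w + b) < 1          # margin violators
        gw = lam * w - Xm[viol].T @ ym[viol] / len(ym)
        gb = -ym[viol].sum() / len(ym)
        w -= lr * gw
        b -= lr * gb
    return w, b

def trimmed_svm(X, y, n_outliers, rounds=5, **kw):
    """Heuristic joint training: fit the SVM, flag the n_outliers points
    with the largest hinge loss as outliers, and refit on the rest.
    This stands in for the paper's convex relaxation, which optimizes
    the classifier and the outlier indicators jointly via an SDP."""
    mask = np.ones(len(y), dtype=bool)
    for _ in range(rounds):
        w, b = hinge_svm(X, y, mask=mask, **kw)
        hinge = np.maximum(0.0, 1.0 - y * (X @ w + b))
        mask = np.ones(len(y), dtype=bool)
        if n_outliers > 0:
            mask[np.argsort(hinge)[-n_outliers:]] = False
    return w, b, mask
```

On data with a mislabeled point, the trimmed fit recovers the clean decision boundary while a single soft-margin fit lets the outlier pull the boundary toward it, which is the failure mode the abstract describes.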
Year | Venue | Keywords |
---|---|---|
2006 | AAAI | outlier detection, large margin training principle, support vector machine, joint training problem, robust support vector machine, large margin training method, convex outlier ablation, soft margin criterion, standard soft margin approach, training process, outlier sensitivity, direct approach |
Field | DocType | Citations |
---|---|---|
Anomaly detection, Hinge loss, Computer science, Artificial intelligence, Direct method, Mathematical optimization, Pattern recognition, Support vector machine, Outlier, Regular polygon, Boosting (machine learning), Margin classifier, Machine learning | Conference | 78 |
PageRank | References | Authors |
---|---|---|
4.13 | 11 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Linli Xu | 1 | 790 | 42.51 |
Koby Crammer | 2 | 5252 | 466.86 |
Dale Schuurmans | 3 | 2760 | 317.49 |