Title
Speedup Two-Class Supervised Outlier Detection
Abstract
Outlier detection is an important topic in the data mining and machine learning communities. Two-class supervised outlier detection requires solving a large quadratic program whose size is twice the number of samples in the training set, so training a two-class supervised outlier detection model is time consuming. In this paper, we show that the result of two-class supervised outlier detection is determined by a small number of critical samples, namely those with nonzero Lagrange multipliers, and that these critical samples must lie near the boundary of each class. Training the two-class supervised outlier detector on the subset consisting of critical samples is much faster. We compare three methods for finding boundary samples. The experimental results show that the nearest neighbors' distribution is the most suitable for finding critical samples for two-class supervised outlier detection. When only the critical samples identified by the nearest neighbors' distribution information are retained, two-class supervised novelty detection becomes much faster and its performance does not degrade.
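The abstract describes a two-stage pipeline: first select "critical" samples near the class boundary using nearest-neighbor information, then train the (QP-based) detector only on that subset. The paper's exact selection criterion is not given here; the sketch below uses a common, hypothetical heuristic (a sample is kept if some of its k nearest neighbors carry the opposite label) with scikit-learn's `NearestNeighbors` and, as a stand-in for the two-class supervised outlier detector, a standard `SVC`. All parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def boundary_subset(X, y, k=10, min_mixed=1):
    """Mark samples whose k-NN neighborhood is label-mixed.

    Heuristic proxy for 'near the class boundary': a sample is kept
    if at least `min_mixed` of its k nearest neighbors have the
    opposite label. (Assumed criterion, not the paper's exact one.)
    """
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)            # idx[:, 0] is the point itself
    neighbor_labels = y[idx[:, 1:]]      # drop self, keep k neighbors
    mixed = (neighbor_labels != y[:, None]).sum(axis=1)
    return mixed >= min_mixed

# Synthetic two-class data standing in for a real training set.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

mask = boundary_subset(X, y)
# Train only on the retained critical samples; the QP the solver faces
# is now much smaller than the one over the full training set.
clf_sub = SVC().fit(X[mask], y[mask])
print(f"{mask.sum()} of {len(X)} samples retained")
```

Because only the boundary-region samples can carry nonzero Lagrange multipliers in the full problem, training on this reduced set is expected to leave the decision function, and hence detection performance, essentially unchanged while shrinking the quadratic program substantially.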
Year
2018
DOI
10.1109/ACCESS.2018.2877701
Venue
IEEE ACCESS
Keywords
Supervised outlier detection, critical sample, nearest neighbors' distribution
Field
Training set, Anomaly detection, Novelty detection, Pattern recognition, Computer science, Lagrange multiplier, Support vector machine, Artificial intelligence, Quadratic programming, Distributed computing, Speedup
DocType
Journal
Volume
6
ISSN
2169-3536
Citations
3
PageRank
0.36
References
0
Authors
4
Name | Order | Citations | PageRank
Yugen Yi | 1 | 92 | 15.25
Wei Zhou | 2 | 18 | 1.75
Yanjiao Shi | 3 | 34 | 3.14
Jiangyan Dai | 4 | 14 | 4.19