Title
CGMOS: Certainty Guided Minority OverSampling
Abstract
Handling imbalanced datasets is a challenging problem that, if not treated correctly, results in reduced classification performance. Imbalanced datasets are commonly handled using minority oversampling, and the SMOTE algorithm is a successful oversampling algorithm with numerous extensions. Existing SMOTE extensions carry no theoretical guarantee of outperforming SMOTE during training, and in many instances their performance is data dependent. In this paper we propose a novel extension of the SMOTE algorithm with a theoretical guarantee of improved classification performance. The proposed approach, CGMOS (Certainty Guided Minority OverSampling), considers the classification performance of both the majority and minority classes and adds new data points by considering certainty changes in the dataset. The paper provides a proof that the proposed algorithm is guaranteed to outperform SMOTE on training data. Experimental results on 30 real-world datasets further show that CGMOS outperforms existing algorithms across 6 different classifiers.
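For context, the baseline SMOTE procedure that CGMOS extends generates synthetic minority samples by interpolating between a minority point and one of its k nearest minority neighbors. A minimal sketch of that baseline (not of CGMOS itself; the function name and parameters here are illustrative, not taken from the paper):

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, rng=None):
    """Sketch of SMOTE-style oversampling: for each synthetic point,
    pick a random minority sample, find its k nearest minority
    neighbors, and interpolate toward a randomly chosen neighbor.
    Names and defaults are illustrative, not from the paper."""
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, dtype=float)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # Euclidean distances from sample i to all minority samples
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        d[i] = np.inf  # exclude the sample itself from its neighbors
        neighbors = np.argsort(d)[:k]
        j = rng.choice(neighbors)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)
```

CGMOS differs from this baseline in how it chooses where to add points: per the abstract, it guides the sampling by certainty changes in the dataset rather than sampling minority seeds uniformly at random.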
Year
2016
DOI
10.1145/2983323.2983789
Venue
ACM International Conference on Information and Knowledge Management
Keywords
machine learning, oversampling, imbalance learning, smote, supervised learning
DocType
Conference
Volume
abs/1607.06525
Citations
4
PageRank
0.41
References
22
Authors
5
Name, Order, Citations, PageRank
Xi Zhang, 1, 8, 0.89
Di Ma, 2, 32, 4.06
Lin Gan, 3, 186, 30.78
Shanshan Jiang, 4, 129, 20.15
Gady Agam, 5, 391, 43.99