Oversampling Method for Imbalanced Classification
Keywords:
Classification, imbalanced dataset, oversampling, SMOTE, SNOCC
Abstract
Classification on imbalanced datasets is pervasive in many data mining domains, and imbalanced classification has been a hot topic in the academic community. From the data level to the algorithm level, many solutions have been proposed to tackle the problems caused by imbalanced datasets. SMOTE is the most popular data-level method, and many derivatives of it have been developed to alleviate the class imbalance problem. Our investigation indicates that SMOTE has severe flaws. We propose a new oversampling method, SNOCC, that compensates for the defects of SMOTE. In SNOCC, we increase the number of seed samples, so that new samples are no longer confined to the line segment between two seed samples as they are in SMOTE. We also employ a novel algorithm, different from previous ones, to find the nearest neighbors of samples. These two improvements let the samples created by SNOCC naturally reproduce the distribution of the original seed samples. Our experimental results show that SNOCC outperforms SMOTE and CBSO (a SMOTE-based method).
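The contrast described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' SNOCC implementation; it only juxtaposes SMOTE-style interpolation, where a new sample lies on the line segment between a seed and one of its neighbors, with a convex combination of several seeds, which can place new samples anywhere inside the seeds' convex hull. The function names and the Dirichlet weighting are illustrative assumptions.

```python
# Illustrative sketch only: NOT the authors' SNOCC algorithm. It contrasts
# SMOTE-style two-seed interpolation with a multi-seed convex combination,
# the kind of construction the abstract attributes to SNOCC.
import numpy as np

rng = np.random.default_rng(0)

def smote_style_sample(x_i, x_nn):
    """New sample lies on the line segment between seed x_i and one neighbor x_nn."""
    gap = rng.uniform(0.0, 1.0)
    return x_i + gap * (x_nn - x_i)

def multi_seed_sample(seeds):
    """New sample is a random convex combination of several seeds, so it can fall
    anywhere inside their convex hull, not only on a two-point segment."""
    seeds = np.asarray(seeds)
    w = rng.dirichlet(np.ones(len(seeds)))  # non-negative weights summing to 1
    return w @ seeds

# Toy minority-class points
minority = np.array([[0.0, 0.0], [1.0, 0.2], [0.4, 1.0]])

print(smote_style_sample(minority[0], minority[1]))  # on the segment between two seeds
print(multi_seed_sample(minority))                   # inside the triangle spanned by three seeds
```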
Published
2016-03-01
How to Cite
Zheng, Z., Cai, Y., & Li, Y. (2016). Oversampling Method for Imbalanced Classification. Computing and Informatics, 34(5), 1017–1037. Retrieved from https://www.cai.sk/ojs/index.php/cai/article/view/1277