Abstract |
---|
Twin Support Vector Machines (TWSVMs) have emerged as an efficient alternative to Support Vector Machines (SVMs) for learning from imbalanced datasets. The TWSVM learns two non-parallel classifying hyperplanes by solving a pair of smaller-sized optimization problems. However, it is unsuitable for large datasets, as it involves expensive matrix operations. In this paper, we discuss a Twin Neural Network (Twin NN) architecture for learning from large unbalanced datasets. The objective functions of the networks in the Twin NN are designed to realize the idea of the Twin SVM, with non-parallel decision boundaries for the respective classes, while also reducing model complexity. The Twin NN optimizes the feature map, allowing for better discrimination between classes. The paper also discusses an extension of the Twin NN to multiclass datasets. This architecture trains as many neural networks as there are classes, and has the additional advantage that it has no hyper-parameters that require tuning. Results presented in the paper demonstrate that the Twin NN generalizes and scales well on large unbalanced datasets. |
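The core twin idea described in the abstract — each class gets its own hyperplane, kept close to that class while the other class is pushed at least unit distance away, with new points assigned to the class of the nearer plane — can be illustrated with a minimal gradient-descent sketch of the two linear twin objectives. This is an illustrative toy (all data, names, and the choice of plain gradient descent are assumptions of this sketch), not the paper's Twin NN implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 2-D data: class +1 clustered near the origin, class -1 near (3, 3).
A = rng.normal(loc=0.0, scale=0.4, size=(50, 2))  # class +1 samples
B = rng.normal(loc=3.0, scale=0.4, size=(50, 2))  # class -1 samples

def train_plane(own, other, c=1.0, lr=0.01, epochs=500):
    """One twin objective: keep the plane w.x + b = 0 close to `own`
    (least-squares proximity term) while the hinge term pushes `other`
    to satisfy other @ w + b <= -1 (unit margin on the far side)."""
    w = np.zeros(own.shape[1])
    b = 0.0
    for _ in range(epochs):
        # Gradient of the proximity term 0.5 * ||own @ w + b||^2
        r = own @ w + b
        gw = own.T @ r
        gb = r.sum()
        # Gradient of the hinge term c * sum(max(0, 1 + (other @ w + b)))
        s = other @ w + b
        viol = (1 + s) > 0          # rows still inside the margin
        gw += c * other[viol].sum(axis=0)
        gb += c * viol.sum()
        w -= lr * gw / len(own)
        b -= lr * gb / len(own)
    return w, b

w1, b1 = train_plane(A, B)  # plane hugging class +1
w2, b2 = train_plane(B, A)  # plane hugging class -1

def predict(X):
    # Assign each point to the class whose plane is nearer (perpendicular distance).
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

acc = (np.r_[predict(A), -predict(B)] == 1).mean()
```

The Twin NN of the paper replaces each linear plane with a neural network, so the feature map itself is learned rather than fixed; the two-objective structure above is what both models share.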
Year | DOI | Venue |
---|---|---|
2019 | 10.1016/j.neucom.2018.07.089 | Neurocomputing |
Keywords | DocType | Volume
---|---|---
Twin SVM, Neural network, Unbalanced datasets, Large scale learning, Skewed data | Journal | 343
ISSN | Citations | PageRank
---|---|---
0925-2312 | 2 | 0.35
References | Authors
---|---
0 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Jayadeva | 1 | 788 | 38.14 |
Himanshu Pant | 2 | 7 | 2.45 |
Mayank Sharma | 3 | 168 | 22.18 |
Sumit Soman | 4 | 20 | 7.53