Title: Classification by nearness in complementary subspaces
Abstract
This study introduces a classifier founded on k-nearest neighbours in complementary subspaces (NCS). The global space, spanned by all training samples, can be decomposed into the direct sum of two subspaces with respect to one class: the projections of that class's samples into one subspace are nonzero, while their projections into the other subspace are zero. A query sample is projected into the two subspaces of each class. In each subspace, the distance from the projection vector to the mean of its k-nearest neighbours is calculated, and the final classification rule is designed in terms of the two distances obtained in the two complementary subspaces. Exploiting the geometric meaning of the Gram determinant and the kernel trick, the classifier is naturally implemented in the kernel space. Experimental results on one synthetic, 13 IDA binary-class, and five UCI multi-class data sets show that NCS compares favourably, on almost all data sets, to competing classifiers based on the k-nearest neighbours or the nearest subspace. The classifier handles multi-class problems directly, and its performance is promising.
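The abstract outlines the projection-and-distance idea without giving the paper's exact decision rule. Below is a minimal NumPy sketch of that idea, under the following assumptions: each class subspace is spanned by that class's training samples, the complement is taken inside the global span of all training samples, neighbours are sought among projected training samples, and a simple distance-ratio rule picks the class. The function names (orthonormal_basis, ncs_distances, ncs_classify) and the ratio rule are illustrative assumptions, not the paper's formulation, and the Gram-determinant kernelised version is not shown.

```python
import numpy as np

def orthonormal_basis(X, tol=1e-10):
    # Orthonormal basis for the column span of X (samples stored as columns).
    Q, R = np.linalg.qr(X)
    rank = int(np.sum(np.abs(np.diag(R)) > tol))
    return Q[:, :rank]

def ncs_distances(x, X_class, X_all, k=3):
    # Class subspace: span of this class's samples; complement: the part of
    # the global span (all training samples) orthogonal to it.
    B_cls = orthonormal_basis(X_class)
    B_all = orthonormal_basis(X_all)
    B_comp = orthonormal_basis(B_all - B_cls @ (B_cls.T @ B_all))

    def knn_mean_dist(B, refs):
        # Distance from the projected query to the mean of its k nearest
        # projected reference samples, inside the subspace spanned by B.
        if B.shape[1] == 0:
            return 0.0
        p = B.T @ x
        P = B.T @ refs
        idx = np.argsort(np.linalg.norm(P - p[:, None], axis=0))[:k]
        return float(np.linalg.norm(P[:, idx].mean(axis=1) - p))

    d_in = knn_mean_dist(B_cls, X_class)   # nearness inside the class subspace
    d_out = knn_mean_dist(B_comp, X_all)   # nearness in the complement
    return d_in, d_out

def ncs_classify(x, class_samples, k=3):
    # class_samples: dict mapping label -> matrix of that class's samples.
    X_all = np.hstack(list(class_samples.values()))
    scores = {}
    for label, Xc in class_samples.items():
        d_in, d_out = ncs_distances(x, Xc, X_all, k)
        # Assumed rule: favour the class whose in-subspace distance is small
        # relative to the complementary one.
        scores[label] = d_in / (d_in + d_out + 1e-12)
    return min(scores, key=scores.get)
```

For a query x of shape (d,) and class_samples such as {0: X0, 1: X1} with samples as columns, ncs_classify(x, class_samples) returns the predicted label; the ratio score is one plausible way to combine the two distances the abstract describes.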
Year: 2013
DOI: 10.1007/s10044-012-0308-4
Venue: Pattern Analysis & Applications
Keywords: Complementary subspaces, Gram determinant, Projection vector, Kernel function
Field: Random subspace method, Artificial intelligence, Classifier (linguistics), Vector projection, Kernel (linear algebra), Combinatorics, Subspace topology, Pattern recognition, Linear subspace, Kernel method, Mathematics, Machine learning, Kernel (statistics)
DocType: Journal
Volume: 16
Issue: 4
ISSN: 1433-7541
Citations: 1
PageRank: 0.35
References: 19
Authors: 4

Name            Order  Citations  PageRank
Menglong Yang   1      109        10.49
Yiguang Liu     2      338        37.15
Baojiang Zhong  3      57         9.80
Zheng Li        4      1          0.35