Title
The Research of Multi-Label <tex>$k$</tex>-Nearest Neighbor Based on Descending Dimension
Abstract
With the deepening of research on data classification, multi-label classification has become a hot research topic. Multi-label k-nearest neighbor (ML-kNN) is a classification method that predicts the labels of unclassified instances by learning from classified instances. However, this method does not consider the interrelationships between attributes and labels. Taking these relationships into account can improve the accuracy of classification methods, but the diversity of attributes and labels introduces the curse of dimensionality, which prevents such methods from scaling to big-data settings. To solve this problem, this paper proposes three methods: multi-label k-nearest neighbor based on principal component analysis (PML-kNN), coupled-similarity multi-label k-nearest neighbor based on principal component analysis (PCSML-kNN), and coupled-similarity multi-label k-nearest neighbor based on feature selection (FCSML-kNN), which use feature extraction and feature selection to reduce the dimensionality of the attribute space. We evaluate ML-kNN and the three proposed methods on two real data sets; the experimental results show that reducing the dimensionality of the attributes improves the efficiency of the classification methods.
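The pipeline the abstract describes (reduce the attribute dimensionality, then classify each label by k-nearest-neighbor voting) can be sketched as below. This is an illustrative reconstruction, not the authors' code: the PCA step stands in for the feature-extraction variants, the voting rule is a simplified version of ML-kNN (plain majority vote rather than the MAP estimate), and the toy data and parameters are hypothetical.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project X onto its top principal components (the feature-extraction step)."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data matrix; rows of Vt are the principal axes.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def ml_knn_predict(X_train, Y_train, X_test, k=3):
    """Predict each label independently by majority vote among the k nearest
    training instances (a simplified stand-in for ML-kNN's MAP rule)."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)
        nn = np.argsort(dists)[:k]
        # Assign a label if more than half of the k neighbors carry it.
        preds.append((Y_train[nn].mean(axis=0) > 0.5).astype(int))
    return np.array(preds)

# Toy data: 6 instances, 4 features, 2 labels (hypothetical values).
X = np.array([[1.0, 0.9, 0.1, 0.0],
              [0.9, 1.1, 0.0, 0.1],
              [1.1, 1.0, 0.2, 0.1],
              [0.0, 0.1, 1.0, 0.9],
              [0.1, 0.0, 0.9, 1.1],
              [0.2, 0.1, 1.1, 1.0]])
Y = np.array([[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]])

X_red = pca_reduce(X, n_components=2)     # descending dimension: 4 -> 2 features
P = ml_knn_predict(X_red, Y, X_red, k=3)  # classify in the reduced space
print(P.tolist())
```

Because PCA is a linear projection, the neighborhood structure that drives the kNN vote is largely preserved while distance computations run on far fewer features, which is the efficiency gain the paper targets.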
Year
2018
DOI
10.1109/SERA.2018.8477210
Venue
2018 IEEE 16th International Conference on Software Engineering Research, Management and Applications (SERA)
Keywords
Multi-label, classification, ML-kNN, Descending Dimension
Field
k-nearest neighbors algorithm, Data mining, Decision tree, Combinatorics, Feature selection, Computer science, Feature extraction, Curse of dimensionality, Data classification, Statistical classification, Principal component analysis
DocType
Conference
ISBN
978-1-5386-5887-1
Citations
0
PageRank
0.34
References
5
Authors
4
Name          Order  Citations  PageRank
Song Gao      1      24         5.20
Xiaodan Yang  2      0          0.34
Lihua Zhou    3      18         7.71
Shaowen Yao   4      862        6.85