Title
A unified multi-label classification framework with supervised low-dimensional embedding
Abstract
Discovering and exploiting data structures and label correlations during learning is an important issue in multi-label classification, as it can greatly improve learning performance. This paper proposes a unified framework for multi-label classification that incorporates a supervised low-dimensional embedding into the predictive model. The supervised embedding exploits latent structures and correlations among samples and labels, finds informative shared characteristics in a low-dimensional subspace, and yields a high-quality dimensionality reduction. Within the framework, a low-dimensional feature mapping is constructed through a linear transformation guided by label information; at the same time, the weights of the multi-label classifier are determined. The framework leads to a trace optimization problem that can be solved as a generalized eigenvalue problem. A dual form of the framework is also proposed to handle high-dimensional cases. Experiments on ten datasets show that the proposed framework achieves better or comparable performance in terms of multi-label classification and ranking measures, and requires much less training time in most cases. Furthermore, the framework is robust to the size of the low-dimensional subspace.
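The sketch below illustrates, in general terms, the kind of label-guided linear embedding obtained from a generalized eigenvalue problem that the abstract describes. It is a minimal sketch, not the authors' formulation: the label-scatter matrix X^T Y Y^T X, the regularized covariance X^T X + lam*I, and the function name supervised_embedding are all assumptions made for illustration.

```python
# Minimal sketch (assumed formulation, not the paper's exact model) of a
# supervised low-dimensional embedding solved as a generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh

def supervised_embedding(X, Y, k, lam=1e-3):
    """Return a d x k projection W from a generalized eigenvalue problem.

    X : (n, d) feature matrix; Y : (n, q) binary label matrix; k : subspace size.
    """
    # Label-guided scatter: feature directions correlated with the labels.
    M = X.T @ Y @ Y.T @ X
    # Regularized feature covariance keeps the denominator matrix positive definite.
    N = X.T @ X + lam * np.eye(X.shape[1])
    # Generalized symmetric eigenproblem M w = mu N w; eigh returns ascending eigenvalues.
    vals, vecs = eigh(M, N)
    return vecs[:, -k:]  # top-k generalized eigenvectors span the subspace

# Toy usage: project features, then fit any multi-label classifier in the subspace.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
Y = (rng.random((200, 5)) < 0.2).astype(float)
W = supervised_embedding(X, Y, k=10)
Z = X @ W  # low-dimensional representation used for prediction
```

For very high-dimensional features, the same trace problem can be posed in its dual (kernel/sample-space) form, which is the case the abstract's dual formulation targets.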
Year
2016
DOI
10.1016/j.neucom.2015.07.087
Venue
NEUROCOMPUTING
Keywords
Multi-label classification,Data structure,Label correlation,Supervised low-dimensional embedding,Dimensionality reduction,Generalized eigenvalue problem
Field
Data structure,Embedding,Dimensionality reduction,Pattern recognition,Ranking,Subspace topology,Multi-label classification,Artificial intelligence,Classifier (linguistics),Optimization problem,Mathematics,Machine learning
DocType
Journal
Volume
171
ISSN
0925-2312
Citations
1
PageRank
0.35
References
24
Authors
2
Name          Order   Citations   PageRank
Zijie Chen    1       4           0.76
Zhifeng Hao   2       653         78.36