Abstract |
---|
This paper proposes a novel spectral feature selection approach that embeds two modified subspace learning methods into a sparse feature selection framework. Specifically, adaptive graph matrix learning and a low-rank constraint are used to preserve the local and the global structure of the data simultaneously, while sparse learning and the low-rank constraint also relieve the impact of noise. Furthermore, graph matrix learning and low-dimensional feature space learning are coupled into a unified framework, aiming at a globally optimal feature selection. Comparing the proposed method against competing methods on four real-world benchmark datasets shows that it achieves competitive classification performance. |
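The abstract's "graph matrix" for local-structure preservation can be illustrated with a minimal sketch. The code below builds a fixed k-NN heat-kernel similarity graph and ranks features by a Laplacian-Score-style smoothness criterion; this is only an assumed illustration of the local-preservation idea, not the paper's method, which learns the graph adaptively and adds sparse and low-rank regularization. All function names and parameters here are hypothetical.

```python
import numpy as np

def knn_heat_graph(X, k=5, sigma=1.0):
    """Build a symmetric k-NN similarity graph with heat-kernel weights.

    Illustrates a fixed graph matrix; the paper instead learns this
    matrix adaptively (not reproduced here).
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances between samples.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]        # k nearest neighbours, skipping self
        S[i, idx] = np.exp(-d2[i, idx] / (2 * sigma ** 2))
    return np.maximum(S, S.T)                    # symmetrise

def laplacian_feature_scores(X, S):
    """Score each feature by how smoothly it varies over the graph
    (a Laplacian-Score-style criterion; lower = more structure-preserving)."""
    D = np.diag(S.sum(axis=1))
    L = D - S                                    # unnormalised graph Laplacian
    Xc = X - X.mean(axis=0)                      # centre each feature
    num = np.einsum('ij,ik,kj->j', Xc, L, Xc)    # f^T L f per feature
    den = np.einsum('ij,ik,kj->j', Xc, D, Xc)    # f^T D f per feature
    return num / np.maximum(den, 1e-12)

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 6))                 # 40 samples, 6 features
S = knn_heat_graph(X, k=5)
scores = laplacian_feature_scores(X, S)
selected = np.argsort(scores)[:3]                # keep the 3 smoothest features
```

The selected indices would then feed a downstream classifier, which is how the abstract's classification comparison could be run.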
Year | DOI | Venue |
---|---|---
2017 | 10.1109/ICBK.2017.48 | 2017 IEEE International Conference on Big Knowledge (ICBK) |
Keywords | Field | DocType
---|---|---
adaptive graph matrix learning, sparsity representation, local and global preservation | Competitive learning, Dimensionality reduction, Semi-supervised learning, Instance-based learning, Pattern recognition, Active learning (machine learning), Feature selection, Unsupervised learning, Artificial intelligence, Mathematics, Feature learning, Machine learning | Conference
ISBN | Citations | PageRank
---|---|---
978-1-5386-3121-8 | 0 | 0.34
References | Authors
---|---
0 | 4
Name | Order | Citations | PageRank |
---|---|---|---
Yonghua Zhu | 1 | 216 | 12.38 |
Xuejun Zhang | 2 | 70 | 16.55 |
Rongyao Hu | 3 | 243 | 14.01 |
Guoqiu Wen | 4 | 46 | 4.62 |