Abstract |
---|
High-dimensional observations typically contain several kinds of inherent relational structure, and exploiting them is crucial for multi-output regression. This paper therefore proposes a new multi-output regression method that simultaneously accounts for three kinds of relational structure: the relationships between outputs, between features and outputs, and between samples. Specifically, the method captures the correlation among output variables via a low-rank constraint, identifies the correlation between features and outputs by imposing an ℓ2,1-norm regularization on the coefficient matrix to perform feature selection, and discovers the correlation among samples by applying the ℓ2,1-norm to the loss function to perform sample selection. Furthermore, an effective iterative optimization algorithm is proposed to solve the resulting convex but non-smooth objective function. Finally, experimental results on several real datasets show that the proposed method outperforms all comparison algorithms in terms of aCC and aRMSE. |
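The objective sketched in the abstract combines an ℓ2,1 loss, an ℓ2,1 penalty on the coefficient matrix, and a low-rank constraint on it. The paper's own iterative solver is not reproduced here; the following is only an illustrative subgradient-descent sketch of that kind of objective, assuming the low-rank constraint is relaxed to a nuclear-norm penalty (a standard convex surrogate) and using hypothetical parameter names `lam1`, `lam2`:

```python
import numpy as np

def l21_norm(M):
    # sum of the Euclidean norms of the rows of M
    return np.sqrt((M ** 2).sum(axis=1)).sum()

def fit_multioutput(X, Y, lam1=0.1, lam2=0.1, lr=5e-3, iters=2000, eps=1e-8):
    """Illustrative subgradient descent on
         ||Y - X W||_{2,1} + lam1 * ||W||_{2,1} + lam2 * ||W||_*
    where the nuclear norm ||W||_* stands in for the low-rank constraint.
    Not the authors' algorithm; a generic sketch of the objective class."""
    d, m = X.shape[1], Y.shape[1]
    W = np.zeros((d, m))
    for _ in range(iters):
        R = X @ W - Y                                        # residual, n x m
        rnorm = np.sqrt((R ** 2).sum(axis=1, keepdims=True)) + eps
        g_loss = X.T @ (R / rnorm)                           # subgradient of ||Y - XW||_{2,1}
        wnorm = np.sqrt((W ** 2).sum(axis=1, keepdims=True)) + eps
        g_l21 = W / wnorm                                    # subgradient of ||W||_{2,1}
        U, _, Vt = np.linalg.svd(W, full_matrices=False)
        g_nuc = U @ Vt                                       # subgradient of ||W||_*
        W -= lr * (g_loss + lam1 * g_l21 + lam2 * g_nuc)
    return W
```

The row-wise ℓ2,1 penalty drives whole rows of `W` toward zero (discarding features), the ℓ2,1 loss downweights outlier samples, and the nuclear-norm term encourages correlated outputs to share a low-dimensional subspace.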
Year | Venue | Field |
---|---|---|
2016 | ADMA | Data mining, Coefficient matrix, Feature selection, Regression, Computer science, Algorithm, Regular polygon, Correlation, Regularization (mathematics), Optimization algorithm, Sample selection |
DocType | Citations | PageRank |
---|---|---|
Conference | 0 | 0.34 |
References | Authors |
---|---|
27 | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Shichao Zhang | 1 | 2777 | 164.25 |
Lifeng Yang | 2 | 33 | 2.07 |
Yonggang Li | 3 | 42 | 6.60 |
Yan Luo | 4 | 5 | 3.23 |
Xiaofeng Zhu | 5 | 1960 | 81.85 |