Abstract | ||
---|---|---|
The paper deals with the task of robust nonlinear regression in the presence of outliers. The problem is treated in the context of reproducing kernel Hilbert spaces (RKHS). In contrast to more classical approaches, a recent trend is to model the outliers as a sparse noise component and to mobilize tools from the sparsity-aware/compressed sensing theory to impose sparsity on it. In this paper, three of the most popular approaches are considered and compared. These represent three major directions in the sparsity-aware learning context, namely: a) a greedy approach, b) a convex relaxation of the sparsity-promoting task via the $\ell_1$ norm-based regularization of the least-squares cost, and c) a Bayesian approach making use of appropriate priors associated with the involved parameters. |
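The $\ell_1$-based direction mentioned in the abstract can be illustrated with a generic sketch (this is not the paper's KGARD algorithm; the model, kernel width, and all parameter values below are illustrative assumptions). Observations are modeled as $y = K\alpha + u + \eta$, where $K$ is a kernel Gram matrix and $u$ is a sparse outlier vector; the cost $\|y - K\alpha - u\|^2 + \lambda\,\alpha^\top K \alpha + \mu\|u\|_1$ is minimized by alternating a kernel ridge solve for $\alpha$ with soft-thresholding for $u$:

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    # Gram matrix of the Gaussian (RBF) kernel for 1-D inputs
    return np.exp(-((x[:, None] - y[None, :]) ** 2) / (2.0 * sigma ** 2))

def robust_kernel_regression(x, y, lam=1e-2, mu=0.5, sigma=0.3, iters=50):
    """Alternating minimization of
         ||y - K a - u||^2 + lam * a^T K a + mu * ||u||_1
       (hypothetical illustration of l1-regularized robust kernel regression)."""
    K = gaussian_kernel(x, x, sigma)
    n = len(y)
    u = np.zeros(n)
    A = K + lam * np.eye(n)          # minimizing over a gives a = (K + lam I)^{-1} (y - u)
    for _ in range(iters):
        a = np.linalg.solve(A, y - u)
        r = y - K @ a                # residual carries the outliers
        u = np.sign(r) * np.maximum(np.abs(r) - mu / 2.0, 0.0)  # soft threshold
    return a, u

# toy example: noisy sinc with a few gross outliers injected
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 100)
y = np.sinc(x) + 0.05 * rng.standard_normal(100)
y[[10, 50, 80]] += 5.0               # sparse outlier contamination
a, u = robust_kernel_regression(x, y)
```

The soft-thresholding step is the proximal operator of the $\ell_1$ penalty, so nonzero entries of `u` flag the samples the procedure treats as outliers, while the smooth kernel expansion `K @ a` recovers the underlying function.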
Year | Venue | Keywords |
---|---|---|
2015 | European Signal Processing Conference | Robust regression in RKHS, learning with kernels, kernel greedy algorithm for robust denoising (KGARD), robust non-linear regression
Field | DocType | ISSN
---|---|---|
Kernel (linear algebra), Principal component regression, Robustness (computer science), Robust regression, Kernel principal component analysis, Regularization (mathematics), Artificial intelligence, Mathematics, Compressed sensing, Machine learning, Reproducing kernel Hilbert space | Conference | 2076-1465
Citations | PageRank | References |
---|---|---|
0 | 0.34 | 11 |
Authors | ||
---|---|---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
George Papageorgiou | 1 | 10 | 3.21 |
Pantelis Bouboulis | 2 | 171 | 11.05 |
Sergios Theodoridis | 3 | 1353 | 106.97 |