Abstract |
---|
Matrix completion plays an important role in machine learning and data mining. Although a great number of algorithms have been developed for this problem, most of them can cope only with Gaussian noise or with sparse outliers. This paper focuses on the more difficult setting in which the known entries are corrupted by Gaussian noise and sparse outliers simultaneously. Specifically, we construct a novel model whose loss function is derived from the celebrated Huber function, and we present an efficient optimization method to solve the constructed model. The promising performance of our algorithm is demonstrated through numerous experiments on several benchmark datasets. |
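For context, the classical Huber function referred to in the abstract (with threshold $\delta > 0$) is

$$
H_\delta(r) =
\begin{cases}
\tfrac{1}{2} r^2, & |r| \le \delta, \\
\delta |r| - \tfrac{1}{2}\delta^2, & |r| > \delta,
\end{cases}
$$

which is quadratic for small residuals (suited to Gaussian noise) and linear for large ones (robust to sparse outliers). A generic robust matrix-completion objective built on such a loss, sketched here only as an illustration since the paper's exact formulation is not given in this record, would be $\min_{X} \sum_{(i,j)\in\Omega} H_\delta(X_{ij} - M_{ij})$ subject to a low-rank constraint on $X$, where $\Omega$ denotes the set of observed entries of the data matrix $M$.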
Year | DOI | Venue |
---|---|---
2020 | 10.1007/s11042-019-08430-2 | Multimedia Tools and Applications |
Keywords | Field | DocType
---|---|---
Matrix completion, Subspace learning, Sparse outliers | Computer vision, Matrix completion, Computer science, Artificial intelligence | Journal
Volume | Issue | ISSN
---|---|---
79 | 3 | 1380-7501
Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---
Li Tang | 1 | 85 | 8.78 |
Weili Guan | 2 | 43 | 10.84 |