Abstract |
---|
Low-rank representation (LRR) can recover clean data from noisy data while effectively characterizing the subspace structure of the data, making it one of the state-of-the-art methods for subspace learning; it is widely used in machine learning, image processing, and data mining. In this paper, we propose a novel three-term low-rank tensor decomposition approach called enhanced tensor LRR (ETLRR). In ETLRR, the original data tensor is decomposed into three parts: a low-rank structure tensor, a sparse noise tensor, and a Gaussian noise tensor. First, unlike existing LRR-related methods, which model only one kind of noise (Laplacian or Gaussian), we model both types of noise, recovering the clean tensor more effectively and thereby obtaining a more accurate low-rank tensor subspace structure. Second, the denoised tensor, rather than the original data tensor, is adopted to construct the dictionary. Third, ETLRR operates directly on the tensor formed by the samples, so two-dimensional data such as image samples need not be converted into vectors in advance. Finally, we propose an iterative update method for optimizing ETLRR based on the alternating direction method of multipliers (ADMM). Compared with state-of-the-art methods, experiments on synthetic data, image clustering, and image and video denoising verify the good performance of ETLRR in both recovering the low-rank tensor subspace structure and restoring the tensor data. |
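The three-term decomposition described in the abstract can be illustrated on a plain matrix (e.g. one unfolding of a data tensor): a low-rank part L, a sparse-noise part S, and a dense Gaussian residual E. The sketch below is a generic alternating proximal scheme for a convex surrogate of that model, not the paper's actual ETLRR/ADMM algorithm; the weights `lam1`, `lam2` and the update order are illustrative assumptions.

```python
# Illustrative three-term decomposition X = L + S + E on a matrix:
# L low-rank, S sparse (Laplacian-like) noise, E dense Gaussian noise.
# NOT the paper's ETLRR/ADMM; lam1, lam2 are illustrative choices.
import numpy as np

def svt(M, tau):
    """Singular-value thresholding: proximal operator of tau * ||.||_* ."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt  # scale columns of U by shrunk s

def soft(M, tau):
    """Entrywise soft-thresholding: proximal operator of tau * ||.||_1 ."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def three_term_decompose(X, lam1=1.0, lam2=2.0, n_iter=200):
    """Block-coordinate minimization of
         ||L||_* + lam1 * ||S||_1 + (lam2 / 2) * ||X - L - S||_F**2,
    so the residual E = X - L - S plays the role of the Gaussian noise term."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(n_iter):
        L = svt(X - S, 1.0 / lam2)     # exact update of the low-rank part
        S = soft(X - L, lam1 / lam2)   # exact update of the sparse-noise part
    return L, S, X - L - S

# Synthetic demo: rank-3 signal + 5% sparse spikes + small Gaussian noise.
rng = np.random.default_rng(0)
L0 = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 50))
S0 = np.where(rng.random((50, 50)) < 0.05, 5.0, 0.0)
X = L0 + S0 + 0.01 * rng.standard_normal((50, 50))
L, S, E = three_term_decompose(X)
```

Because the two nonsmooth terms are block-separable and each block update is an exact proximal minimization, this coordinate scheme converges for the convex surrogate; the paper's ADMM instead handles the tensor-dictionary constraint directly.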
Year | DOI | Venue |
---|---|---|
2022 | 10.1016/j.knosys.2022.108468 | Knowledge-Based Systems |
Keywords | DocType | Volume
---|---|---|
Low-rank representation, Tensor data clustering, Tensor data denoising, Low-rank tensor subspace | Journal | 243
ISSN | Citations | PageRank
---|---|---|
0950-7051 | 0 | 0.34
References | Authors
---|---|
0 | 5
Name | Order | Citations | PageRank |
---|---|---|---|
Shiqiang Du | 1 | 0 | 0.34 |
Baokai Liu | 2 | 0 | 0.34 |
Guangrong Shan | 3 | 0 | 0.68 |
Yuqing Shi | 4 | 1 | 1.02 |
Weilan Wang | 5 | 9 | 11.75 |