Title
Exploiting Global Low-Rank Structure and Local Sparsity Nature for Tensor Completion
Abstract
In the era of data science, a large amount of data arises in the form of tensors. In many applications, the collected tensor data are incomplete, with missing entries that hinder subsequent analysis. In this paper, we investigate a new method for tensor completion in which a low-rank tensor approximation is used to exploit the global structure of the data, while sparse coding is used to elucidate its local patterns. To characterize the low-rank structure, a weighted nuclear norm for tensors is introduced. Meanwhile, an orthogonal dictionary learning process is incorporated into the sparse coding to discover the local details of the data more effectively. By simultaneously using global patterns and local cues, the proposed method can effectively and efficiently recover the missing information of incomplete tensor data. The capability of the proposed method is demonstrated in several experiments on recovering MRI data and visual data, and the results show its excellent performance in comparison with recent related methods.
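The abstract combines two priors: a global low-rank prior enforced through a weighted tensor nuclear norm, and a local sparsity prior enforced through sparse coding under a learned orthogonal dictionary. The Python sketch below is only an illustrative approximation of that idea under stated assumptions, not the authors' algorithm: it alternates weighted singular-value thresholding on the mode unfoldings with data consistency on the observed entries, and shows the closed-form code/dictionary updates that an orthogonal dictionary admits. All function names and parameter values (tau, thresh, n_iter) are assumptions for illustration.

```python
# Illustrative sketch only (assumptions throughout): NOT the authors' exact
# algorithm, just a minimal demonstration of the two priors described in the
# abstract -- a weighted nuclear-norm (low-rank) step on the tensor unfoldings
# and a closed-form sparse-coding step under an orthogonal dictionary.
import numpy as np


def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)


def fold(M, mode, shape):
    """Inverse of `unfold` for a tensor of the given shape."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)


def weighted_svt(M, tau, eps=1e-3):
    """Weighted singular-value thresholding: large singular values receive
    small weights (tau / (sigma + eps)), so dominant structure is preserved."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau / (s + eps), 0.0)
    return (U * s) @ Vt


def orthogonal_sparse_coding(X, D, thresh):
    """One sparse-coding step with an orthogonal dictionary D:
    codes are hard-thresholded analysis coefficients (closed form),
    and D is refreshed by an orthogonal Procrustes update."""
    A = D.T @ X
    A[np.abs(A) < thresh] = 0.0
    U, _, Vt = np.linalg.svd(X @ A.T, full_matrices=False)
    return U @ Vt, A


def complete(T_obs, mask, n_iter=50, tau=1.0):
    """Fill missing entries by alternating a global low-rank step
    (averaged over all mode unfoldings) with data consistency."""
    X = T_obs.copy()
    for _ in range(n_iter):
        X = np.mean([fold(weighted_svt(unfold(X, m), tau), m, X.shape)
                     for m in range(X.ndim)], axis=0)
        X[mask] = T_obs[mask]  # keep the observed entries fixed
    return X


if __name__ == "__main__":
    # Toy experiment: a synthetic low-rank 20x20x20 tensor, 40% entries missing.
    rng = np.random.default_rng(0)
    core = rng.normal(size=(3, 3, 3))
    factors = [rng.normal(size=(20, 3)) for _ in range(3)]
    T = np.einsum("abc,ia,jb,kc->ijk", core, *factors)
    mask = rng.random(T.shape) < 0.6
    rec = complete(T * mask, mask)
    err = np.linalg.norm((rec - T)[~mask]) / np.linalg.norm(T[~mask])
    print(f"relative error on missing entries: {err:.3f}")

    # Local step (illustrative): sparse-code the mode-1 fibers of the estimate
    # under a random orthogonal dictionary and refine the dictionary once.
    D0 = np.linalg.qr(rng.normal(size=(20, 20)))[0]
    D, codes = orthogonal_sparse_coding(unfold(rec, 0), D0, thresh=0.1)
```

The Procrustes update in `orthogonal_sparse_coding` is the standard closed-form solution for an orthogonality-constrained dictionary; with an unconstrained dictionary, the codes would instead require an iterative solver such as OMP.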
Year
2019
DOI
10.1109/TCYB.2018.2853122
Venue
IEEE Transactions on Cybernetics
Keywords
Tensile stress, Matrix decomposition, Encoding, Machine learning, Minimization, Dictionaries, Magnetic resonance imaging
Field
Tensor, Neural coding, Matrix decomposition, Algorithm, Exploit, Stress (mechanics), Matrix norm, Minification, Artificial intelligence, Machine learning, Mathematics, Encoding (memory)
DocType
Journal
Volume
49
Issue
11
ISSN
2168-2275
Citations
3
PageRank
0.38
References
34
Authors
7
Name               Order  Citations  PageRank
Yong Du            1      3          0.72
Guoqiang Han       2      439        43.27
Yuhui Quan         3      270        21.69
Zhiwen Yu          4      2753       220.67
Hau-San Wong       5      1008       86.89
C. L. Philip Chen  6      4022       244.76
Jun Zhang          7      468        49.02