Title
Tensor completion via a multi-linear low-n-rank factorization model
Abstract
The tensor completion problem is to recover a low-n-rank tensor from a subset of its entries. The main solution strategy has been based on extensions of the trace norm for the minimization of tensor rank via convex optimization. This strategy bears the computational cost of the singular value decomposition (SVD), which becomes increasingly expensive as the size of the underlying tensor increases. In order to reduce the computational cost, we propose a multi-linear low-n-rank factorization model and apply the nonlinear Gauss-Seidel method, which only requires solving a linear least squares problem per iteration, to solve this model. Numerical results show that the proposed algorithm can reliably solve a wide range of problems at least several times faster than the trace norm minimization algorithm.
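The abstract's key idea is to replace SVD-based trace norm minimization with a factorization updated by nonlinear Gauss-Seidel (alternating least squares) sweeps. The following is a minimal illustrative sketch of that idea for a single mode, i.e. low-rank matrix completion of one unfolding; all names (`complete_matrix`, `A`, `B`, `mask`, `rank`) are this sketch's own assumptions, not the paper's notation or implementation.

```python
import numpy as np

def complete_matrix(M, mask, rank, iters=200):
    """Recover a low-rank matrix from the entries where mask is True.

    Sketch of an alternating least squares scheme: each factor update is a
    linear least-squares problem, so no SVD is required per iteration.
    """
    rng = np.random.default_rng(0)
    m, n = M.shape
    A = rng.standard_normal((m, rank))
    B = rng.standard_normal((rank, n))
    X = np.where(mask, M, 0.0)          # current estimate; missing entries start at 0
    for _ in range(iters):
        A = X @ np.linalg.pinv(B)       # least-squares update of A with B fixed
        B = np.linalg.pinv(A) @ X       # least-squares update of B with A fixed
        X = A @ B
        X[mask] = M[mask]               # re-impose the observed entries
    return A @ B

# Usage: recover a rank-2 matrix with roughly half of its entries observed.
rng = np.random.default_rng(1)
truth = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(truth.shape) < 0.5
est = complete_matrix(truth, mask, rank=2)
err = np.linalg.norm(est - truth) / np.linalg.norm(truth)
```

In the paper's multi-linear setting, an analogous pair of least-squares updates would be applied to each mode-n unfolding of the tensor in turn, which is what keeps the per-iteration cost below that of repeated SVDs.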
Year
2014
DOI
10.1016/j.neucom.2013.11.020
Venue
Neurocomputing
Keywords
trace norm, main solution strategy, underlying tensor increase, squares problem, tensor completion problem, computational cost, low-n-rank tensor, tensor rank, proposed algorithm, multi-linear low-n-rank factorization model, singular value decomposition
Field
Rank (linear algebra), Applied mathematics, Tensor, Tensor (intrinsic definition), Artificial intelligence, Linear least squares, Singular value decomposition, Rank factorization, Mathematical optimization, Pattern recognition, Factorization, Convex optimization, Mathematics
DocType
Journal
Volume
133
ISSN
0925-2312
Citations
30
PageRank
0.81
References
14
Authors
5
Name         | Order | Citations | PageRank
Huachun Tan  | 1     | 132       | 10.03
Bin Cheng    | 2     | 30        | 0.81
Wuhong Wang  | 3     | 94        | 13.30
Yu Jin Zhang | 4     | 1272      | 93.14
Bin Ran      | 5     | 194       | 31.52