Title
Near-Optimal Sample Complexity For Convex Tensor Completion
Abstract
We study the problem of estimating a low-rank tensor when we have noisy observations of a subset of its entries. A rank-r, order-d, N × N × ⋯ × N tensor, where r = O(1), has O(dN) free variables. On the other hand, prior to our work, the best sample complexity that was achieved in the literature is O(N^{d/2}), obtained by solving a tensor nuclear-norm minimization problem. In this paper, we consider the 'M-norm', an atomic norm whose atoms are rank-1 sign tensors. We also consider a generalization of the matrix max-norm to tensors, which results in a quasi-norm that we call 'max-qnorm'. We prove that solving an M-norm constrained least squares (LS) problem results in nearly optimal sample complexity for low-rank tensor completion (TC). A similar result holds for max-qnorm as well. Furthermore, we show that these bounds are nearly minimax rate-optimal. We also provide promising numerical results for max-qnorm constrained TC, showing improved recovery compared to matricization and alternating LS.
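As a rough illustration of the estimator the abstract refers to (the notation below is assumed for this sketch and is not quoted from the paper): writing \Omega for the set of observed entries, Y_\omega for the corresponding noisy observations, and R for a bound on the M-norm of the candidate tensor, the M-norm constrained least squares estimator takes the form

\hat{T} \in \operatorname*{arg\,min}_{\|T\|_{M} \le R} \; \sum_{\omega \in \Omega} \left( T_{\omega} - Y_{\omega} \right)^{2},

with the max-qnorm constrained variant obtained by replacing the constraint \|T\|_{M} \le R with a bound on the max-qnorm.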
Year
2017
DOI
10.1093/imaiai/iay019
Venue
Information and Inference: A Journal of the IMA
Keywords
compressed sensing, tensor completion, matrix completion, max norm, low-rank tensor, M-norm constrained tensor completion, Rademacher complexity
Field
Discrete mathematics, Combinatorics, Minimax, Mathematical optimization, Tensor, Free variables and bound variables, Matrix (mathematics), Rademacher complexity, Regular polygon, Matricization, Mathematics, Unit sphere
DocType
Journal
Volume
8
Issue
3
ISSN
2049-8764
Citations
3
PageRank
0.37
References
27
Authors
3
Name               Order  Citations  PageRank
Navid Ghadermarzy  1      10         1.94
Yaniv Plan         2      1174       57.19
Özgür Yilmaz       3      685        51.36