Title
Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors.
Abstract
We consider two problems that arise in machine learning applications: the problem of recovering a planted sparse vector in a random linear subspace and the problem of decomposing a random low-rank overcomplete 3-tensor. For both problems, the best known guarantees are based on the sum-of-squares method. We develop new algorithms inspired by analyses of the sum-of-squares method. Our algorithms achieve the same or similar guarantees as sum-of-squares for these problems but the running time is significantly faster.

For the planted sparse vector problem, we give an algorithm with running time nearly linear in the input size that approximately recovers a planted sparse vector with up to constant relative sparsity in a random subspace of ℝ^n of dimension up to Ω(√n). These recovery guarantees match the best known ones of Barak, Kelner, and Steurer (STOC 2014) up to logarithmic factors.

For tensor decomposition, we give an algorithm with running time close to linear in the input size (with exponent ≈ 1.125) that approximately recovers a component of a random 3-tensor over ℝ^n of rank up to Ω(n^{4/3}). The best previous algorithm for this problem due to Ge and Ma (RANDOM 2015) works up to rank Ω(n^{3/2}) but requires quasipolynomial time.
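As a concrete illustration of the two input models described in the abstract, the sketch below samples an instance of each problem: a random d-dimensional subspace of ℝ^n that contains a planted sparse vector, and a random rank-r overcomplete 3-tensor with unit-norm components. This is not code from the paper; the parameter names, normalizations, and the sign/Gaussian distributions are illustrative assumptions, and the sketch only generates problem instances, it does not implement the authors' recovery algorithms.

```python
import numpy as np


def planted_sparse_vector_instance(n, d, sparsity, seed=None):
    """Sample a basis of a random d-dimensional subspace of R^n that
    contains a planted sparse unit vector (returned for reference)."""
    rng = np.random.default_rng(seed)
    # Planted vector: random +-1 signs on a random support of size ~ sparsity * n.
    support = rng.choice(n, size=max(1, int(sparsity * n)), replace=False)
    v = np.zeros(n)
    v[support] = rng.choice([-1.0, 1.0], size=support.size)
    v /= np.linalg.norm(v)
    # The remaining d - 1 directions are independent Gaussian vectors.
    B = np.column_stack([v, rng.standard_normal((n, d - 1))])
    # Apply a random rotation so that no single basis column reveals the planted vector.
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return B @ Q, v


def random_overcomplete_3tensor(n, r, seed=None):
    """Sample T = sum_{i=1}^r a_i ⊗ a_i ⊗ a_i for r random unit vectors a_i in R^n."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((r, n))
    A /= np.linalg.norm(A, axis=1, keepdims=True)  # each component a_i is a unit vector
    T = np.einsum("ia,ib,ic->abc", A, A, A)        # n x n x n symmetric 3-tensor
    return T, A
```

For instance, planted_sparse_vector_instance(n=4000, d=60, sparsity=0.05) returns a basis whose span hides a vector with 5% nonzero coordinates (d on the order of √n), and random_overcomplete_3tensor(n=100, r=300) returns a 3-tensor whose rank exceeds n, matching the overcomplete regime the abstract refers to.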
Year
2016
DOI
10.1145/2897518.2897529
Venue
STOC '16: Symposium on Theory of Computing, Cambridge, MA, USA, June 2016
Keywords
tensor decomposition, sparse recovery, tensor principal component analysis, spectral methods, semidefinite programming, random matrices
Field
Discrete mathematics, Combinatorics, Exponent, Subspace topology, Algorithm, Linear subspace, Spectral method, Logarithm, Explained sum of squares, Semidefinite programming, Mathematics, Random matrix
DocType
Conference
ISSN
0737-8017
ISBN
978-1-4503-4132-5
Citations
18
PageRank
0.67
References
30
Authors
4
Name             Order   Citations   PageRank
Samuel Hopkins   1       88          9.47
Tselil Schramm   2       78          8.82
Jonathan Shi     3       62          3.29
David Steurer    4       934         44.91