Title
Adapting to Unknown Noise Distribution in Matrix Denoising.
Abstract
We consider the problem of estimating an unknown matrix $\boldsymbol{X}\in \mathbb{R}^{m\times n}$ from observations $\boldsymbol{Y} = \boldsymbol{X}+\boldsymbol{W}$, where $\boldsymbol{W}$ is a noise matrix with independent and identically distributed entries, so as to minimize the estimation error measured in operator norm. Assuming that the underlying signal $\boldsymbol{X}$ is low-rank and incoherent with respect to the canonical basis, we prove that the minimax risk is equivalent to $(\sqrt{m}\vee\sqrt{n})/\sqrt{I_W}$ in the high-dimensional limit $m,n\to\infty$, where $I_W$ is the Fisher information of the noise. Crucially, we develop an efficient procedure that achieves this risk, adaptively over the noise distribution (under certain regularity assumptions). Letting $\boldsymbol{X} = \boldsymbol{U}\boldsymbol{\Sigma}\boldsymbol{V}^{{\sf T}}$, where $\boldsymbol{U}\in \mathbb{R}^{m\times r}$, $\boldsymbol{V}\in\mathbb{R}^{n\times r}$ are orthogonal and $r$ is kept fixed as $m,n\to\infty$, we use our method to estimate $\boldsymbol{U}$, $\boldsymbol{V}$. Standard spectral methods provide non-trivial estimates of the factors $\boldsymbol{U},\boldsymbol{V}$ (weak recovery) only if the singular values of $\boldsymbol{X}$ are larger than $(mn)^{1/4}{\rm Var}(W_{11})^{1/2}$. We prove that the new approach achieves weak recovery down to the information-theoretically optimal threshold $(mn)^{1/4}I_W^{-1/2}$.
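The abstract contrasts the classical spectral threshold $(mn)^{1/4}{\rm Var}(W_{11})^{1/2}$ with the information-theoretic one $(mn)^{1/4}I_W^{-1/2}$. Below is a minimal numerical sketch of that gap, assuming numpy: it compares a plain truncated SVD of $\boldsymbol{Y}$ with an SVD applied after an entrywise score transform. The Student-t noise, the known score function `t_score`, and all dimensions are illustrative assumptions; the paper's actual procedure adapts to an unknown noise distribution rather than plugging in the true score.

```python
# Sketch of the matrix-denoising model from the abstract: observe Y = X + W with
# X low-rank and incoherent, W i.i.d. noise, and recover the singular subspaces
# of X by truncated SVD.  The entrywise score transform below is a stand-in for
# the paper's adaptive procedure (here the noise law is assumed known).
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 300, 400, 2                      # dimensions and fixed rank r

# Low-rank signal X = sigma * U V^T with orthonormal factors U, V.
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
# Illustrative singular value chosen between the spectral threshold
# (mn)^{1/4} Var^{1/2} and the information-theoretic one (mn)^{1/4} I_W^{-1/2}.
sigma = 1.5 * (m * n) ** 0.25
X = sigma * U @ V.T

# Heavy-tailed i.i.d. noise: Var(W_11) = 3 for t_3, while its Fisher
# information I_W = 2/3 > 1/Var, so a score transform can help.
W = rng.standard_t(df=3, size=(m, n))
Y = X + W

def rank_r_svd(A, r):
    """Leading rank-r singular factors of A (baseline spectral method)."""
    Ul, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Ul[:, :r], s[:r], Vt[:r, :].T

def t_score(y, df=3.0):
    """Score function -p'/p of the unit-scale Student-t density (assumed known here)."""
    return (df + 1.0) * y / (df + y ** 2)

def subspace_err(Uhat, Utrue):
    """Operator-norm distance between the projectors onto the two column spans."""
    return np.linalg.norm(Uhat @ Uhat.T - Utrue @ Utrue.T, 2)

# Baseline: truncated SVD of Y directly.
U_pca, _, V_pca = rank_r_svd(Y, r)
# Score-transformed spectral estimate: apply the score entrywise, then SVD.
U_sc, _, V_sc = rank_r_svd(t_score(Y), r)

print("plain SVD   :", subspace_err(U_pca, U))
print("score + SVD :", subspace_err(U_sc, U))
```

With this choice of $\sigma$, the plain SVD sits below its variance-based threshold while the score-transformed estimate sits above the Fisher-information-based one, so the second printed error should typically be noticeably smaller; the experiment is only a qualitative illustration of the thresholds stated in the abstract.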
Year
2018
Venue
arXiv: Statistics Theory
Field
Noise reduction, Standard basis, Combinatorics, Singular value, Matrix (mathematics), Operator norm, Independent and identically distributed random variables, Fisher information, Statistics, Mathematics
DocType
Journal
Volume
abs/1810.02954
Citations
0
PageRank
0.34
References
0
Authors
3
Name	Order	Citations	PageRank
Andrea Montanari	1	2863	195.89
Feng Ruan	2	3	1.75
Jun Yan	3	13	8.90