Abstract |
---|
A new tensor approximation method is developed based on the CANDECOMP/PARAFAC (CP) factorization that enjoys both sparsity (i.e., yielding factor matrices with some zero elements) and resistance to outliers and non-Gaussian measurement noise. This method uses a robust bounded loss function for the errors of the low-rank tensor approximation while encouraging sparsity through Lasso (ℓ1) regularization of the factor matrices of the tensor data. A simple alternating, iteratively reweighted (IRW) Lasso algorithm is proposed to solve the resulting optimization problem. Simulation studies illustrate that the proposed method provides excellent mean square error performance under heavy-tailed noise, at relatively small loss under conventional Gaussian noise. |
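The alternating IRW Lasso procedure outlined in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it assumes Huber-type weights as a stand-in for the paper's bounded loss, ISTA for the Lasso subproblems, and a robust median-based scale estimate; all function names (`unfold`, `khatri_rao`, `weighted_lasso`, `robust_sparse_cp`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def soft(Z, t):
    """Soft-thresholding operator (proximal map of the l1 norm)."""
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

def weighted_lasso(X, Y, w, lam, n_iter=100):
    """ISTA for min_B 0.5*||W^{1/2}(Y - X B)||_F^2 + lam*||B||_1 (w = row weights)."""
    Xw = np.sqrt(w)[:, None] * X
    Yw = np.sqrt(w)[:, None] * Y
    L = np.linalg.norm(Xw, 2) ** 2 + 1e-12  # gradient Lipschitz constant
    B = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        B = soft(B - Xw.T @ (Xw @ B - Yw) / L, lam / L)
    return B

def robust_sparse_cp(T, rank, lam=0.05, c=1.345, n_outer=20):
    """Alternating IRW Lasso sketch: cycle over modes, downweighting
    outlying samples before each sparse least-squares subproblem."""
    factors = [rng.standard_normal((d, rank)) for d in T.shape]
    for _ in range(n_outer):
        for mode in range(3):
            others = [factors[m] for m in range(3) if m != mode]
            X = khatri_rao(others[0], others[1])  # design matrix for this mode
            Y = unfold(T, mode).T                 # samples = tensor fibers
            R = Y - X @ factors[mode].T
            r = np.linalg.norm(R, axis=1)         # per-sample residual norms
            s = np.median(r) + 1e-12              # robust scale estimate
            w = np.minimum(1.0, c * s / (r + 1e-12))  # Huber-type weights
            factors[mode] = weighted_lasso(X, Y, w, lam).T
    return factors
```

The bounded weighting caps the influence of any single corrupted fiber, while the ℓ1 penalty inside each mode's subproblem drives small factor entries exactly to zero.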
Year | DOI | Venue |
---|---|---|
2014 | 10.1109/SSP.2014.6884665 | SSP |
Keywords | Field | DocType
---|---|---|
candecomp-parafac factorization,cp factorization,big data,nongaussian measurement noise,heavy-tailed noise conditions,iteratively reweighted least squares,outliers,factor matrices,lasso,optimization problem,robust bounded loss function,irw lasso algorithm,robust iteratively reweighted lasso regularization,matrix decomposition,low-rank tensor approximation method,gaussian noise,sparse tensor factorization,regularization,robust loss function,tensors,iterative methods,ℓ1-regularization,mean square error accuracy | Mathematical optimization,Tensor,Matrix (mathematics),Lasso (statistics),Algorithm,Mean squared error,Iteratively reweighted least squares,Regularization (mathematics),Gaussian noise,Optimization problem,Mathematics | Conference |
Citations | PageRank | References
---|---|---|
2 | 0.38 | 5 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Hyon-Jung Kim | 1 | 12 | 2.48 |
Esa Ollila | 2 | 351 | 33.51 |
Visa Koivunen | 3 | 1917 | 187.81 |
H. V. Poor | 4 | 25411 | 1951.66 |