Title
2D sparse dictionary learning via tensor decomposition
Abstract
Most existing dictionary learning methods focus on 1D signals, which incurs heavy memory and computation costs when the set of training samples is large. Recently, the 2D dictionary learning paradigm has been shown to save substantial memory, especially for large-scale problems. Following this direction, we propose novel tensor-based 2D dictionary learning algorithms in this paper. The learning problem is solved efficiently via CANDECOMP/PARAFAC (CP) decomposition. In addition, our algorithms enforce a sparsity constraint, which guarantees that the sparse representation under the learned dictionary is equivalent to the ground truth. Experimental results confirm the effectiveness of our methods.
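The abstract names CANDECOMP/PARAFAC (CP) decomposition as the computational workhorse. The paper's actual 2D dictionary learning algorithm is not reproduced here; as background, the following is a minimal, generic CP decomposition sketch via alternating least squares (ALS) in NumPy. All function names, dimensions, and the rank are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product of two factor matrices."""
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def unfold(X, mode):
    """Mode-n unfolding of a 3-way tensor (C-order column layout)."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def cp_als(X, rank, n_iter=300, seed=0):
    """Rank-R CP decomposition of a 3-way tensor by alternating least squares.

    Each update solves a linear least-squares problem for one factor while
    holding the other two fixed.
    """
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in X.shape)
    for _ in range(n_iter):
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def cp_reconstruct(A, B, C):
    """Rebuild the tensor from its CP factors: X[i,j,k] = sum_r A[i,r]B[j,r]C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)
```

On an exactly low-rank tensor, ALS with random initialization typically drives the relative reconstruction error close to zero; the sparsity constraint discussed in the paper would require an additional projection or penalty step not shown in this sketch.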
Year
2014
DOI
10.1109/GlobalSIP.2014.7032166
Venue
GlobalSIP
Keywords
signal representation,2d sparse dictionary learning,sparse representation,tensor decomposition,candecomp/parafac (cp) decomposition,dictionary learning,2d matrices,cp decomposition,sparsity constraint,tensor,singular value decomposition,matrix decomposition,information processing,sparse matrices,big data,dictionaries
Field
Information processing,K-SVD,Computer science,Matrix decomposition,Sparse approximation,Theoretical computer science,Ground truth,Artificial intelligence,Big data,Sparse matrix,Machine learning,Computation
DocType
Conference
Citations
3
PageRank
0.37
References
8
Authors
3
Name             | Order | Citations | PageRank
Sung-Hsien Hsieh | 1     | 48        | 13.71
Chun-shien Lu    | 2     | 1238      | 104.71
Soo-Chang Pei    | 3     | 449       | 46.82