Title: Doubly sparse transform learning with convergence guarantees
Abstract
The sparsity of natural signals in transform domains such as the DCT has been heavily exploited in various applications. Recently, we introduced the idea of learning sparsifying transforms from data, and demonstrated the usefulness of learnt transforms in image representation and denoising. However, the learning formulations therein were non-convex, and the algorithms lacked strong convergence properties. In this work, we propose a novel convex formulation for square sparsifying transform learning. We also enforce a doubly sparse structure on the transform, which makes its learning, storage, and implementation efficient. Our algorithm is guaranteed to converge to a global optimum, and moreover converges quickly. We also introduce a non-convex variant of the convex formulation, for which the algorithm is locally convergent. We show the superior promise of our learnt transforms as compared to analytical sparsifying transforms such as the DCT for image representation.
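The transform model the abstract refers to uses a learnt square transform W to sparsify signals: the sparse code of a signal x is obtained by keeping only the s largest-magnitude entries of W x. The sketch below is not the paper's algorithm, only an illustrative, hypothetical implementation of that sparse-coding step; the function name and parameters are ours.

```python
import numpy as np

def transform_sparse_code(W, x, s):
    """Project W @ x onto its s largest-magnitude coefficients.

    Illustrative sketch of transform-domain sparse coding (not the
    authors' code): in the transform model, the exact sparse code is
    obtained by simple thresholding of W @ x, unlike synthesis sparse
    coding, which requires solving an optimization problem.
    """
    z = W @ x
    if s < z.size:
        # indices of the (n - s) smallest-magnitude entries, to be zeroed
        drop = np.argsort(np.abs(z))[: z.size - s]
        z[drop] = 0.0
    return z

# Toy usage with a random square transform in place of a learnt W
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
x = rng.standard_normal(8)
z = transform_sparse_code(W, x, s=3)
assert np.count_nonzero(z) == 3
```

The cheapness of this step (a matrix-vector product plus a threshold) is one motivation for the transform-learning approach; the doubly sparse structure in the paper makes the product with W itself cheaper as well.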
Year: 2014
DOI: 10.1109/ICASSP.2014.6854607
Venue: Acoustics, Speech and Signal Processing
Keywords: convex programming, discrete cosine transforms, image representation, learning (artificial intelligence), DCT, convergence guarantees, convex formulation, doubly sparse transform learning, image denoising, learning formulations, natural signal sparsity, square sparsifying transform learning, Convex learning, Sparse representations
Field: Convergence (routing), Noise reduction, Mathematical optimization, Computer science, Discrete cosine transform, Image representation, Global optimum, Regular polygon, Theoretical computer science, Local convergence
DocType: Conference
ISSN: 1520-6149
Citations: 2
PageRank: 0.42
References: 16
Authors: 2
Name                   Order  Citations  PageRank
Saiprasad Ravishankar  1      587        36.58
Yoram Bresler          2      1104       119.17