Title: Blockwise coordinate descent schemes for sparse representation
Abstract
The standard sparse representation framework decouples the problem into two subproblems, alternating between sparse coding and dictionary learning with different optimizers and treating the elements of the bases and the codes separately. In this paper, we treat the elements of both the bases and the codes homogeneously. The original optimization is decoupled directly into several blockwise alternating subproblems rather than the two above, so the sparse coding and basis learning optimizations are coupled together. The variables involved are partitioned into suitable blocks that preserve convexity, making an exact block coordinate descent possible. For each separable subproblem, a closed-form solution is obtained from the convexity and monotonicity of the parabolic function. The resulting algorithm is therefore simple, efficient, and effective. Experimental results show that it significantly accelerates the learning process.
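The blockwise scheme the abstract describes, alternating closed-form updates over blocks of code coefficients and dictionary atoms, can be sketched roughly as follows. This is a minimal illustration under assumptions, not the paper's exact algorithm: the objective `||X - D S||_F^2 + lam * ||S||_1`, the soft-thresholding code update, the unit-sphere atom update, and all function names are choices made here for the sketch.

```python
import numpy as np

def soft_threshold(v, t):
    # Closed-form minimizer of the 1-D parabolic subproblem
    # a*s^2 - 2*b*s + t*|s| (with a = 1): shrink b toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def block_coordinate_descent(X, n_atoms, lam=0.1, n_iter=20, seed=0):
    """Sketch of blockwise CD for min ||X - D S||_F^2 + lam*||S||_1,
    cycling over atom/code blocks with closed-form updates."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0)              # unit-norm atoms
    S = np.zeros((n_atoms, n))
    for _ in range(n_iter):
        for k in range(n_atoms):
            # Residual excluding atom k's contribution.
            R = X - D @ S + np.outer(D[:, k], S[k])
            # Code block: exact minimizer is soft-thresholded
            # least squares (atoms have unit norm).
            S[k] = soft_threshold(D[:, k] @ R, lam / 2.0)
            # Dictionary block: least-squares atom projected
            # onto the unit sphere.
            u = R @ S[k]
            norm_u = np.linalg.norm(u)
            if norm_u > 0:
                D[:, k] = u / norm_u
    return D, S
```

Because every block update is an exact minimization, the objective is monotonically non-increasing, which is the source of the simplicity and speed the abstract claims.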
Year: 2014
DOI: 10.1109/ICASSP.2014.6854608
Venue: ICASSP
Keywords: learning optimizations, dictionary learning, encoding, blockwise coordinate descent schemes, sparse coding, parabolic equations, parabolic function, optimization, coordinate descent, sparse representation framework, learning process, convergence, minimization, sparse matrices, dictionaries, linear programming
Field: Monotonic function, Mathematical optimization, Convexity, K-SVD, Neural coding, Computer science, Sparse approximation, Separable space, Coordinate descent, Optimization problem
DocType: Conference
ISSN: 1520-6149
Citations: 9
PageRank: 0.46
References: 12
Authors: 5
Name | Order | Citations | PageRank
Bao-Di Liu | 1 | 166 | 27.34
Yu-Xiong Wang | 2 | 354 | 17.75
Bin Shen | 3 | 431 | 34.86
Yu Jin Zhang | 4 | 1272 | 93.14
Yanjiang Wang | 5 | 15 | 8.65