Title
Accelerated <italic>Log</italic>-Regularized Convolutional Transform Learning and Its Convergence Guarantee
Abstract
Convolutional transform learning (CTL), which learns filters in an unsupervised way by minimizing a data-fidelity loss, is becoming pervasive because it keeps the best of both worlds: the benefits of unsupervised learning and the success of convolutional neural networks. There has been growing interest in developing efficient CTL algorithms. However, designing a CTL algorithm that is simultaneously convergent and accelerated, and that produces accurate representations with proper sparsity, remains an open problem. This article presents a new CTL framework with a <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">$\log$ </tex-math></inline-formula> regularizer that can not only obtain accurate representations but also yield strong sparsity. To address the resulting nonconvex composite optimization efficiently, we propose to employ the proximal difference-of-convex algorithm (PDCA), which decomposes the nonconvex regularizer into the difference of two convex parts and then optimizes the resulting convex subproblems. Furthermore, we introduce an extrapolation technique to accelerate the algorithm, leading to a fast and efficient CTL algorithm. In particular, we provide a rigorous convergence analysis for the proposed algorithm under the accelerated PDCA. The experimental results demonstrate that the proposed algorithm converges more stably to desirable solutions with lower approximation error and stronger sparsity, and thus learns filters efficiently, while converging faster than existing CTL algorithms.
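To make the abstract's recipe concrete, the sketch below shows a PDCA-with-extrapolation iteration on a toy log-regularized least-squares problem (not the paper's CTL objective; the matrix `A`, the parameters `lam` and `eps`, and the FISTA-style extrapolation weight are illustrative assumptions). It uses the standard difference-of-convex split of the log penalty into an L1 part minus a smooth convex correction, as described in the abstract.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def objective(A, b, x, lam=1.0, eps=0.5):
    """Toy objective: 0.5||Ax - b||^2 + lam * sum(log(1 + |x_i|/eps))."""
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.log1p(np.abs(x) / eps))

def pdca_extrapolated(A, b, lam=1.0, eps=0.5, iters=500):
    """Illustrative PDCA-with-extrapolation sketch for the toy objective above.

    DC split of the log regularizer:
        log(1 + |t|/eps) = |t|/eps - (|t|/eps - log(1 + |t|/eps)),
    where both parts are convex and the subtracted part is differentiable.
    """
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    x_prev = x.copy()
    for k in range(1, iters + 1):
        beta = (k - 1.0) / (k + 2.0)               # extrapolation weight (assumed schedule)
        y = x + beta * (x - x_prev)                # extrapolated point
        xi = lam * x / (eps * (eps + np.abs(x)))   # gradient of the subtracted convex part at x
        grad = A.T @ (A @ y - b)                   # gradient of 0.5||Ay - b||^2 at y
        x_prev = x
        x = soft_threshold(y - (grad - xi) / L, lam / (eps * L))
    return x
```

The soft-thresholding step handles the convex L1 part exactly, while the subtracted convex part enters only through its gradient `xi`, which weakens the shrinkage on large coefficients; this is how the log regularizer avoids the uniform bias of plain L1 while still producing exact zeros.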
Year
2022
DOI
10.1109/TCYB.2021.3067352
Venue
IEEE Transactions on Cybernetics
Keywords
Convergence guarantee, convolutional transform learning (CTL), extrapolation, log-regularizer, difference of convex
DocType
Journal
Volume
52
Issue
10
ISSN
2168-2267
Citations
2
PageRank
0.36
References
25
Authors
5
Name | Order | Citations | PageRank
Zhenni Li | 1 | 99 | 14.48
Haoli Zhao | 2 | 2 | 0.36
Yongcheng Guo | 3 | 2 | 0.36
Zu-yuan Yang | 4 | 312 | 24.12
Shengli Xie | 5 | 2530 | 161.51