Title
Distributed Sparse Optimization With Minimax Concave Regularization
Abstract
We study the use of weakly convex minimax concave (MC) regularizers in distributed sparse optimization. The global cost function is the squared error penalized by the MC regularizer. While the global cost is convex as long as the whole system is overdetermined and the regularization parameter is sufficiently small, the local cost of each node is usually nonconvex because the systems given by the local measurements are underdetermined in practical applications. The Moreau decomposition is applied to the MC regularizer so that the total cost takes the form of a smooth function plus the rescaled ℓ1 norm. We propose two solvers: the first applies the proximal gradient exact first-order algorithm (PG-EXTRA) directly to our cost, while the second is based on a convex relaxation of the local costs to ensure convergence. Numerical examples show that the proposed approaches attain significant gains over the ℓ1-based PG-EXTRA.
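The MC regularizer referred to in the abstract is Zhang's minimax concave penalty, whose proximal operator is the firm-thresholding rule; this is what makes proximal-gradient methods such as PG-EXTRA applicable. A minimal scalar sketch follows — the parameter names `lam`, `beta`, and `step` are our own illustrative notation, not the paper's:

```python
import math

def mc_penalty(x, lam, beta):
    """Minimax concave (MC) penalty of a scalar x:
    lam*|x| - x^2/(2*beta) if |x| <= lam*beta, else the
    constant lam^2 * beta / 2 (the penalty flattens out)."""
    ax = abs(x)
    if ax <= lam * beta:
        return lam * ax - ax * ax / (2 * beta)
    return lam * lam * beta / 2

def prox_mc(x, lam, beta, step=1.0):
    """Proximal operator of step * MC penalty (firm thresholding).
    Requires beta > step so the prox subproblem remains convex."""
    t = step * lam            # dead zone: inputs below t map to 0
    ax = abs(x)
    if ax <= t:
        return 0.0
    if ax >= lam * beta:      # beyond the upper threshold: identity
        return x
    # in between: a magnified soft-threshold (firm shrinkage)
    return math.copysign((ax - t) * beta / (beta - step), x)
```

As `beta` grows, the MC penalty approaches the ℓ1 norm and `prox_mc` approaches soft thresholding; as `beta` shrinks toward `step`, it approaches hard thresholding, which is why MC reduces the bias that ℓ1 shrinkage introduces on large coefficients.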
Year
2021
DOI
10.1109/SSP49050.2021.9513764
Venue
2021 IEEE Statistical Signal Processing Workshop (SSP)
Keywords
weakly-convex minimax concave,distributed sparse optimization,global cost function,MC regularizer,smooth function,proximal gradient exact first-order algorithm,convex relaxation,minimax concave regularization,PG-EXTRA
DocType
Conference
ISSN
2373-0803
ISBN
978-1-7281-5768-9
Citations
1
PageRank
0.35
References
0
Authors
3
Name                      Order  Citations  PageRank
Kei Komuro                1      1          0.35
Masahiro Yukawa           2      272        30.44
Renato L. G. Cavalcante   3      180        24.21