Title
Group sparse optimization via lp,q regularization
Abstract
In this paper, we investigate a group sparse optimization problem via lp,q regularization in three aspects: theory, algorithm, and application. In the theoretical aspect, by introducing the notion of a group restricted eigenvalue condition, we establish an oracle property and a global recovery bound of order O(λ^{2/(2-q)}) for any point in a level set of the lp,q regularization problem, and by virtue of modern variational analysis techniques, we also provide a local analysis of the recovery bound, of order O(λ^{2}), for a path of local minima. In the algorithmic aspect, we apply the well-known proximal gradient method to solve the lp,q regularization problems, either by analytically solving some specific lp,q regularization subproblems, or by using the Newton method to solve general lp,q regularization subproblems. In particular, we establish a local linear convergence rate of the proximal gradient method for solving the l1,q regularization problem under some mild conditions, by first proving a second-order growth condition. As a consequence, the local linear convergence rate of the proximal gradient method for solving the usual lq regularization problem (0 < q < 1) is obtained. Finally, in the aspect of application, we present some numerical results on both simulated data and real data in gene transcriptional regulation.
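For reference, the sketch below illustrates the generic proximal gradient iteration for a group-sparse least-squares model. It uses the q = 1, p = 2 case (group soft-thresholding), whose proximal step has a well-known closed form; the function names, the choice of q = 1, and the fixed step size are illustrative assumptions and not the paper's lp,q solver.

```python
import numpy as np

def prox_group_l21(x, groups, tau):
    """Group soft-thresholding: prox of tau * sum_i ||x_{G_i}||_2 (q = 1 case).
    `groups` is a list of index arrays; hypothetical helper for illustration."""
    z = x.copy()
    for g in groups:
        norm = np.linalg.norm(x[g])
        z[g] = 0.0 if norm <= tau else (1.0 - tau / norm) * x[g]
    return z

def proximal_gradient(A, b, groups, lam, n_iter=500):
    """Proximal gradient sketch for min_x 0.5*||Ax - b||^2 + lam * sum_i ||x_{G_i}||_2."""
    # Fixed step 1/L, with L the Lipschitz constant of the least-squares gradient.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth term
        x = prox_group_l21(x - step * grad, groups, step * lam)  # proximal step
    return x
```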
Year
2017
Venue
Journal of Machine Learning Research
Field
Pattern recognition, Proximal Gradient Methods, Algorithm, Regularization (mathematics), Artificial intelligence, Optimization problem, Mathematics
DocType
Journal
Volume
18
Issue
Issue-in-Progress
ISSN
1532-4435
Citations
0
PageRank
0.34
References
0
Authors
5
Name, Order, Citations, PageRank
Yaohua Hu, 1, 144.35
Chong Li, 2, 227.35
K. W. Meng, 3, 132.41
Jing Qin, 4, 110995.43
Xiaoqi Yang, 5, 12620.85