Title: Classification With the Sparse Group Lasso
Abstract: Classification with a sparsity constraint on the solution plays a central role in many high-dimensional signal processing applications. In some cases, the features can be grouped together so that entire subsets of features are selected or discarded. In many applications, however, this is too restrictive. In this paper, we are interested in a less restrictive form of structured sparse feature selection: we assume that while features can be grouped according to some notion of similarity, not all features in a group need be selected for the task at hand. The Sparse Group Lasso (SGL) was proposed to solve problems of this form. The main contributions of this paper are a new procedure called the Sparse Overlapping Group (SOG) lasso, an extension of the SGL to overlapping groups, and theoretical sample complexity bounds for it. We establish model selection error bounds that specialize to many existing settings. We experimentally validate our proposed method on both real and toy datasets.
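The SGL penalty described in the abstract combines an l1 term (within-group sparsity) with a sum of group l2 norms (group-level sparsity). As a minimal sketch of the non-overlapping case only (the paper's SOGlasso handles overlapping groups, typically via variable duplication, which this sketch does not implement), the proximal operator of the SGL penalty can be computed by elementwise soft-thresholding followed by group-wise soft-thresholding; the function name and parameters below are illustrative, not from the paper:

```python
import numpy as np

def prox_sparse_group_lasso(v, groups, lam1, lam2):
    """Prox of lam1 * ||b||_1 + lam2 * sum_g ||b_g||_2 at v,
    for NON-overlapping groups (classic SGL; a sketch, not the
    paper's SOGlasso algorithm)."""
    # Step 1: elementwise soft-thresholding handles the l1 term,
    # zeroing individual features within a group.
    b = np.sign(v) * np.maximum(np.abs(v) - lam1, 0.0)
    # Step 2: group-wise shrinkage handles the l2-of-groups term,
    # possibly zeroing an entire group at once.
    for g in groups:
        norm = np.linalg.norm(b[g])
        scale = max(1.0 - lam2 / norm, 0.0) if norm > 0 else 0.0
        b[g] = scale * b[g]
    return b

# Toy example: two groups of three features each. Some individual
# features are zeroed (within-group sparsity) while both groups
# survive the group-level shrinkage.
v = np.array([3.0, -0.2, 0.1, 2.0, 0.05, -1.5])
b = prox_sparse_group_lasso(v, [[0, 1, 2], [3, 4, 5]], lam1=0.5, lam2=1.0)
```

For non-overlapping groups this two-step composition is exact; with overlapping groups the prox no longer decomposes this way, which is part of what motivates the SOGlasso formulation.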
Year: 2016
DOI: 10.1109/TSP.2015.2488586
Venue: IEEE Transactions on Signal Processing
Keywords: Algorithms, compressed sensing, statistical learning, structured sparsity
Field: Signal processing, Mathematical optimization, Pattern recognition, Feature selection, Computer science, Sparse approximation, Lasso (statistics), Model selection, Correlation, Artificial intelligence, Compressed sensing, Sparse matrix
DocType: Journal
Volume: 64
Issue: 2
ISSN: 1053-587X
Citations: 11
PageRank: 0.68
References: 20
Authors: 4

Name                 Order  Citations  PageRank
Nikhil S. Rao        1      178        15.75
Robert Nowak         2      7309       672.50
Christopher R. Cox   3      11         1.35
Timothy T. Rogers    4      1022       1.17