Title
E-ENDPP: a safe feature selection rule for speeding up Elastic Net
Abstract
The Lasso is a popular regression model that performs automatic variable selection and continuous shrinkage simultaneously. The Elastic Net is a corrective variant of the Lasso that selects groups of correlated variables, and it is particularly useful when the number of features p is much larger than the number of observations n. However, efficiently training the Elastic Net on high-dimensional data remains a challenge. In this paper, we therefore propose a new safe screening rule for the Elastic Net problem, E-ENDPP, which identifies inactive features before training. These inactive features (predictors) can then be removed to reduce the problem size and accelerate training. Since E-ENDPP is derived from the optimality conditions of the model, it is theoretically guaranteed to yield solutions identical to those of the original model. Simulation studies and real-data examples show that the proposed E-ENDPP substantially accelerates Elastic Net training without affecting its accuracy.
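The E-ENDPP rule itself is not reproduced in this record. As a minimal sketch of the general safe-screening idea the abstract describes (provably discarding inactive features before training), the snippet below implements the classical SAFE rule of El Ghaoui et al. for the plain Lasso, which is a simpler relative of the DPP-style rules; the function name and the toy data are our own illustration, not the paper's method.

```python
import numpy as np

def safe_screen_lasso(X, y, lam):
    """SAFE screening rule (El Ghaoui et al.) for the Lasso
        min_b 0.5 * ||y - X b||_2^2 + lam * ||b||_1.
    Returns a boolean mask over features: False means the feature's
    coefficient is provably zero at the optimum, so the feature can be
    dropped before training without changing the solution.
    """
    corr = np.abs(X.T @ y)                 # |x_j^T y| for each column x_j
    lam_max = corr.max()                   # smallest lam for which b* = 0
    col_norms = np.linalg.norm(X, axis=0)  # ||x_j||_2
    radius = np.linalg.norm(y) * (lam_max - lam) / lam_max
    # Discard feature j whenever |x_j^T y| < lam - ||x_j||_2 * radius
    return corr >= lam - col_norms * radius

# Toy illustration in the n << p regime the abstract mentions
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2000))
y = rng.standard_normal(50)
lam_max = np.abs(X.T @ y).max()
keep = safe_screen_lasso(X, y, 0.9 * lam_max)
print(f"kept {keep.sum()} of {keep.size} features")
```

The reduced problem is then solved on `X[:, keep]` only; closer to `lam_max` the rule discards more features, and the guarantee is that every discarded coefficient would have been exactly zero anyway.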
Year: 2019
DOI: 10.1007/s10489-018-1295-y
Venue: Applied Intelligence
Keywords: Elastic Net, Lasso, Screening rule, Feature selection
Field: Feature selection, Regression analysis, Computer science, Elastic net regularization, Lasso (statistics), Algorithm, Artificial intelligence, Machine learning
DocType: Journal
Volume: 49
Issue: 2
ISSN: 1573-7497
Citations: 0
PageRank: 0.34
References: 10
Authors: 4

Name          Order  Citations  PageRank
Yitian Xu     1      489        35.06
Ying Tian     2      0          0.34
Xianli Pan    3      112        7.39
Hongmei Wang  4      311        3.44