Title
Error-Based Knockoffs Inference for Controlled Feature Selection.
Abstract
Recently, model-X knockoffs were proposed as a promising solution to controlled feature selection in high-dimensional, finite-sample settings. However, the model-X knockoffs procedure depends heavily on coefficient-based feature importance and concerns only control of the false discovery rate (FDR). To further improve its adaptivity and flexibility, in this paper we propose an error-based knockoffs inference method that integrates knockoff features, error-based feature importance statistics, and the stepdown procedure. The proposed inference procedure does not require specifying a regression model and can handle feature selection with theoretical guarantees on controlling the false discovery proportion (FDP), the FDR, or the k-familywise error rate (k-FWER). Empirical evaluations on both simulated and real data demonstrate the competitive performance of our approach.
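Note: the sketch below is a minimal, illustrative reading of the pipeline the abstract describes, not the authors' exact procedure. It assumes knockoff copies `X_tilde` have already been generated by some model-X construction, and it uses a random forest, a permutation-based prediction-error statistic, and the classical knockoff+ threshold as placeholder components; the paper's stepdown procedure for FDP and k-FWER control is not implemented here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error


def error_based_statistics(X, X_tilde, y, random_state=0):
    """W_j = (error increase from permuting X_j) - (error increase from permuting its knockoff).

    A model-free, error-based importance statistic: large positive W_j suggests X_j is
    genuinely predictive, while for null features W_j is roughly symmetric around zero,
    as the knockoff framework requires. X and X_tilde are (n, p); X_tilde is assumed to
    come from any valid model-X knockoff construction.
    """
    rng = np.random.default_rng(random_state)
    n, p = X.shape
    XA = np.hstack([X, X_tilde])                      # augmented design [X, X~]
    XA_tr, XA_te, y_tr, y_te = train_test_split(
        XA, y, test_size=0.3, random_state=random_state)
    model = RandomForestRegressor(n_estimators=200, random_state=random_state)
    model.fit(XA_tr, y_tr)
    base_err = mean_squared_error(y_te, model.predict(XA_te))

    def perm_error(col):
        # Error increase on held-out data when one column is permuted (its link to y broken).
        Xp = XA_te.copy()
        Xp[:, col] = rng.permutation(Xp[:, col])
        return mean_squared_error(y_te, model.predict(Xp)) - base_err

    return np.array([perm_error(j) - perm_error(p + j) for j in range(p)])


def knockoff_plus_threshold(W, q=0.1):
    """Classical knockoff+ threshold for nominal FDR level q (shown for reference only)."""
    for t in np.sort(np.abs(W[W != 0])):
        if (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t)) <= q:
            return t
    return np.inf


# Usage (hypothetical data): W = error_based_statistics(X, X_tilde, y)
# selected = np.flatnonzero(W >= knockoff_plus_threshold(W, q=0.1))
```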
Year
2022
Venue
AAAI Conference on Artificial Intelligence
Keywords
Machine Learning (ML)
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
7
Name           Order  Citations  PageRank
Xuebin Zhao    1      0          0.34
Hong Chen      2      173        17.52
Yingjie Wang   3      0          1.69
Weifu Li       4      3          1.46
Tieliang Gong  5      2          4.75
Yulong Wang    6      29         7.01
Feng Zheng     7      369        31.93