Abstract |
---|
Recently, the model-X knockoff scheme was proposed as a promising solution for controlled feature selection in high-dimensional, finite-sample settings. However, the model-X knockoff procedure depends heavily on coefficient-based feature importance and addresses only the control of the false discovery rate (FDR). To further improve its adaptivity and flexibility, this paper proposes an error-based knockoff inference method that integrates knockoff features, error-based feature importance statistics, and a stepdown procedure. The proposed procedure does not require specifying a regression model and performs feature selection with theoretical guarantees on controlling the false discovery proportion (FDP), the FDR, or the k-familywise error rate (k-FWER). Empirical evaluations demonstrate the competitive performance of our approach on both simulated and real data. |
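The abstract only outlines the ingredients of the method. The following is a minimal illustrative sketch of the general error-based knockoff idea, not the authors' exact procedure: it assumes i.i.d. Gaussian features (so an independent Gaussian copy is a valid model-X knockoff), uses the permutation-importance error increase of a random forest as the feature statistic, and applies the standard knockoff+ threshold for FDR control rather than the stepdown procedure described in the paper. All data, parameters, and modeling choices here are assumptions for illustration.

```python
# Sketch: error-based knockoff feature selection (illustrative assumptions only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated data: p i.i.d. Gaussian features, only the first s carry signal.
n, p, s = 500, 50, 10
X = rng.standard_normal((n, p))
beta = np.concatenate([np.ones(s), np.zeros(p - s)])
y = X @ beta + rng.standard_normal(n)

# Knockoff construction (assumption: mutually independent Gaussian features,
# so an independent copy satisfies the swap-exchangeability requirement).
X_ko = rng.standard_normal((n, p))

# Error-based importance: fit a model on [X, X_ko] without specifying a
# regression form; Z_j is the drop in held-out score (increase in error)
# when column j is permuted.
XX = np.hstack([X, X_ko])
XX_tr, XX_te, y_tr, y_te = train_test_split(XX, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(XX_tr, y_tr)
perm = permutation_importance(model, XX_te, y_te, n_repeats=5, random_state=0)
Z = perm.importances_mean           # length 2p: originals first, then knockoffs
W = Z[:p] - Z[p:]                   # antisymmetric knockoff statistic W_j

# Knockoff+ threshold: smallest t with (1 + #{W_j <= -t}) / #{W_j >= t} <= q.
q = 0.2
tau = np.inf
for t in np.sort(np.abs(W[W != 0])):
    if (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t)) <= q:
        tau = t
        break
selected = np.where(W >= tau)[0]
print("selected features:", selected)
```

With the simulated design above, most selected indices should fall among the first s truly relevant features; swapping the random forest for any other learner only changes how Z is computed, which is the flexibility the error-based statistic is meant to provide.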
Year | Venue | Keywords |
---|---|---|
2022 | AAAI Conference on Artificial Intelligence | Machine Learning (ML) |
DocType | Citations | PageRank
---|---|---|
Conference | 0 | 0.34
References | Authors
---|---|
0 | 7
Name | Order | Citations | PageRank |
---|---|---|---|
Xuebin Zhao | 1 | 0 | 0.34 |
Hong Chen | 2 | 173 | 17.52 |
Yingjie Wang | 3 | 0 | 1.69 |
Weifu Li | 4 | 3 | 1.46 |
Tieliang Gong | 5 | 2 | 4.75 |
Yulong Wang | 6 | 29 | 7.01 |
Feng Zheng | 7 | 369 | 31.93 |