Abstract
---

In many application problems, the predictor variables of a regression model are naturally grouped. For example, a factor in an analysis of variance can have several levels, or each original predictor in an additive model can be expanded into polynomials of different orders or a set of basis functions. In such settings it is essential to select both the important groups and the individual variables within the selected groups. In this study, we propose an objective Bayesian procedure for selecting groups and, within the selected groups, individual variables in the regression model; the procedure reduces the computational cost even when the number of regression variables is large. In addition, we examine the consistency of the proposed group variable selection procedure. The proposed objective Bayesian approach is investigated through simulation and real data examples, and comparisons among penalized regression approaches, the Bayesian group lasso, and the proposed method are presented.
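The paper's method rests on Bayes factors under intrinsic priors, which are not reproduced in this record. As a minimal, hypothetical sketch of the underlying idea of comparing a full model against a model with a whole group of predictors removed, the snippet below uses the standard Schwarz (BIC) approximation to the log Bayes factor for nested Gaussian linear models; the data, group structure, and helper names are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def rss_ols(X, y):
    """Residual sum of squares from an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def bic(rss, n, k):
    # Gaussian linear-model BIC, up to an additive constant shared by models
    return n * np.log(rss / n) + k * np.log(n)

def approx_log_bf(X_full, X_reduced, y):
    """Schwarz approximation: log BF(full vs reduced) ~ (BIC_reduced - BIC_full) / 2."""
    n = len(y)
    b_full = bic(rss_ols(X_full, y), n, X_full.shape[1])
    b_red = bic(rss_ols(X_reduced, y), n, X_reduced.shape[1])
    return 0.5 * (b_red - b_full)

# Illustrative data: two groups of three predictors; only group 1 is active.
rng = np.random.default_rng(0)
n = 200
X = rng.standard_normal((n, 6))
y = X[:, :3] @ np.array([1.5, -2.0, 1.0]) + rng.standard_normal(n)

# Test each group as a block: drop it and compare with the full model.
lbf_drop_g1 = approx_log_bf(X, X[:, 3:], y)  # dropping the active group
lbf_drop_g2 = approx_log_bf(X, X[:, :3], y)  # dropping the inert group
print(lbf_drop_g1, lbf_drop_g2)
```

Dropping the active group should yield a large positive log Bayes factor in favor of the full model, while dropping the inert group should not; screening groups this way before selecting individual variables within retained groups is what keeps the search cost manageable when the number of predictors is large.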
| Year | DOI | Venue |
|---|---|---|
| 2022 | 10.1007/s00180-021-01160-w | Computational Statistics |

| Keywords | DocType | Volume |
|---|---|---|
| Bayes factor, Group variable selection, Intrinsic prior, Linear regression model | Journal | 37 |

| Issue | ISSN | Citations |
|---|---|---|
| 3 | 0943-4062 | 0 |

| PageRank | References | Authors |
|---|---|---|
| 0.34 | 1 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Sang Gil Kang | 1 | 0 | 0.34 |
Woo Dong Lee | 2 | 0 | 0.34 |
Yongku Kim | 3 | 1 | 6.03 |