| Abstract |
|---|
| In this paper, we introduce restricted empirical likelihood and restricted penalized empirical likelihood estimators. These estimators are obtained under both unbiasedness and minimum-variance criteria for estimating equations. These criteria produce estimators with appealing properties; in particular, they are more robust against outliers than some existing estimators. Assuming certain prior densities, we develop the Bayesian analysis of the restricted empirical likelihood and the restricted penalized empirical likelihood. Moreover, we provide an EM algorithm to approximate the hyper-parameters. Finally, we carry out a simulation study and illustrate the theoretical results on a real data set. |
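The abstract builds on empirical likelihood for estimating equations. As background, the following is a minimal sketch (not the authors' restricted or penalized method) of classical empirical likelihood for the simplest estimating equation, g(x, μ) = x − μ, testing a candidate mean. The function name and the bisection solver are illustrative choices, not from the paper.

```python
import math

def el_log_ratio(data, mu, tol=1e-10, max_iter=200):
    """-2 log empirical likelihood ratio for H0: E[X] = mu.

    Solves for the Lagrange multiplier lam in
        sum_i (x_i - mu) / (1 + lam * (x_i - mu)) = 0
    by bisection (the score is decreasing in lam), then uses
        w_i = 1 / (n * (1 + lam * (x_i - mu)))
    so that -2 * sum_i log(n * w_i) = 2 * sum_i log(1 + lam * (x_i - mu)).
    """
    z = [x - mu for x in data]
    if min(z) >= 0 or max(z) <= 0:
        raise ValueError("mu must lie strictly inside the data range")
    # Feasible lam must keep every 1 + lam*z_i > 0:
    lo = -1.0 / max(z) + 1e-12
    hi = -1.0 / min(z) - 1e-12

    def score(lam):
        return sum(zi / (1.0 + lam * zi) for zi in z)

    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            lo = mid  # score is decreasing, so the root is to the right
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)
    return 2.0 * sum(math.log(1.0 + lam * zi) for zi in z)
```

For a candidate mean equal to the sample's center of symmetry the multiplier is zero and the statistic vanishes; off-center candidates give a positive statistic, which is what makes the ratio usable for confidence regions. The paper's restricted and penalized variants add constraints and penalties on top of this basic construction.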
| Field | Value |
|---|---|
| Year | 2021 |
| DOI | 10.1007/s00180-020-01046-3 |
| Venue | COMPUTATIONAL STATISTICS |
| Keywords | Empirical likelihood, Restricted empirical likelihood, Bayesian optimization, Gibbs sampling, EM algorithm, Estimating equations |
| DocType | Journal |
| Volume | 36 |
| Issue | 2 |
| ISSN | 0943-4062 |
| Citations | 0 |
| PageRank | 0.34 |
| References | 0 |
| Authors | 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Mahdieh Bayati | 1 | 0 | 0.34 |
| Seyed Kamran Ghoreishi | 2 | 0 | 0.34 |
| Jingjing Wu | 3 | 0 | 0.34 |