Title |
---|
A criterion for vector autoregressive model selection based on Kullback's symmetric divergence |
Abstract |
---|
The Kullback information criterion, KIC, and its univariate bias-corrected version, KICc, are two recently developed criteria for model selection. A small-sample model selection criterion for vector autoregressive models is developed. The proposed criterion is named KICvc, where the notation "vc" stands for vector correction; it can be considered an extension of KIC to vector autoregressive models. KICvc is an unbiased estimator of a variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. Simulation results show that the proposed criterion estimates the model order more accurately than other asymptotically efficient methods when applied to vector autoregressive model selection in small samples. |
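The abstract does not state the KICvc formula itself. As a rough illustration of how a KIC-style criterion is used to select a vector autoregressive (VAR) order, the sketch below fits VAR(p) models by least squares and scores each candidate order with the generic KIC-type penalty, approximately n·log det(Σ̂) + 3k, where k is the number of freely estimated coefficients. The function names (`fit_var_ls`, `kic_var`, `select_order`) and this penalty form are illustrative assumptions, not the paper's small-sample-corrected KICvc.

```python
import numpy as np

def fit_var_ls(X, p):
    """Least-squares fit of a VAR(p) model to data X (n_samples x d).
    Returns the residual covariance estimate."""
    n, d = X.shape
    rows = n - p
    # Regressor matrix: intercept followed by p stacked lags of X.
    Z = np.ones((rows, 1 + d * p))
    for lag in range(1, p + 1):
        Z[:, 1 + d * (lag - 1): 1 + d * lag] = X[p - lag: n - lag]
    Y = X[p:]
    B, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ B
    return resid.T @ resid / rows

def kic_var(X, p):
    """Generic KIC-style score for a VAR(p):
    (n - p) * log det(Sigma_hat) + 3 * k.
    This is an illustrative stand-in; the paper's KICvc adds a
    small-sample vector correction not reproduced here."""
    n, d = X.shape
    sigma = fit_var_ls(X, p)
    k = d * (1 + d * p)  # intercept + lag coefficients, per equation
    sign, logdet = np.linalg.slogdet(sigma)
    return (n - p) * logdet + 3 * k

def select_order(X, max_p):
    """Pick the candidate order minimizing the criterion."""
    scores = {p: kic_var(X, p) for p in range(1, max_p + 1)}
    return min(scores, key=scores.get)
```

On data simulated from a low-order VAR, minimizing such a penalized-likelihood score over candidate orders tends to recover a low order, since each extra lag adds d² coefficients per equation to the penalty term.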
Year | DOI | Venue |
---|---|---|
2005 | 10.1109/ICASSP.2005.1415954 | IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '05) |
Keywords | Field | DocType |
---|---|---|
autoregressive processes, information theory, parameter estimation, signal processing, vectors, Kullback symmetric divergence, univariate bias-corrected Kullback information criterion, vector autoregressive model selection, vector correction | Information theory, Autoregressive model, Signal processing, Divergence, Pattern recognition, Model selection, Bias of an estimator, Artificial intelligence, Estimation theory, Univariate, Mathematics | Conference |
Volume | ISSN | ISBN |
---|---|---|
4 | 1520-6149 | 0-7803-8874-7 |
Citations | PageRank | References |
---|---|---|
0 | 0.34 | 4 |
Authors |
---|
1 |
Name | Order | Citations | PageRank |
---|---|---|---|
Abd-Krim Seghouane | 1 | 78 | 12.27 |