Title
The Behaviour of the Akaike Information Criterion When Applied to Non-nested Sequences of Models
Abstract
A typical approach to selecting between models of differing complexity is to choose the model with the minimum Akaike Information Criterion (AIC) score. This paper examines a common scenario in which there is more than one candidate model with the same number of free parameters, a setting that violates the conditions under which AIC was derived. The main result of the paper is a novel upper bound that quantifies the poor performance of the AIC criterion when applied in this setting. Crucially, the upper bound does not depend on the sample size and does not vanish even asymptotically. Additionally, an AIC-like criterion for sparse feature selection in regression models is derived, and simulation results for denoising a signal by wavelet thresholding demonstrate that the new AIC approach is competitive with SureShrink thresholding.
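The setting the abstract describes can be made concrete with a minimal sketch (not taken from the paper; the helper names `aic` and `fit_rss` and the toy data are illustrative). With two non-nested candidate regressions that have the same number of free parameters k, the AIC penalty term 2k is identical for both, so minimum-AIC selection reduces to comparing the fitted likelihoods alone:

```python
import numpy as np

def aic(n, rss, k):
    """AIC for a Gaussian linear model, up to an additive constant:
    n * log(rss / n) is -2 * max log-likelihood (constants dropped),
    plus the 2k complexity penalty."""
    return n * np.log(rss / n) + 2 * k

def fit_rss(X, y):
    """Least-squares fit; returns the residual sum of squares."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

# Toy data: the response depends on x1 only.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + rng.normal(scale=0.5, size=n)

# Two NON-NESTED candidates, each with k = 1 free parameter: the 2k
# penalty cancels in the comparison, which is the scenario the paper
# identifies as violating AIC's derivation.
candidates = {
    "x1 only": x1.reshape(-1, 1),
    "x2 only": x2.reshape(-1, 1),
}
scores = {name: aic(n, fit_rss(X, y), k=1)
          for name, X in candidates.items()}
best = min(scores, key=scores.get)
print(best)
```

Because both candidates pay the same penalty, the selection here is driven entirely by the residual sums of squares, which is why the paper's non-asymptotic analysis of this case is needed.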
Year
2010
DOI
10.1007/978-3-642-17432-2_23
Venue
AI 2010: ADVANCES IN ARTIFICIAL INTELLIGENCE
Keywords
sample size, feature selection, upper bound, regression model, Akaike information criterion
Field
Bayesian information criterion, Akaike information criterion, Stepwise regression, Feature selection, Pattern recognition, Upper and lower bounds, Algorithm, Artificial intelligence, Thresholding, Sample size determination, Mathematics, Free parameter
DocType
Conference
Volume
6464
ISSN
0302-9743
Citations
0
PageRank
0.34
References
1
Authors
2
Name / Order / Citations / PageRank
D. F. Schmidt1111.28
Enes Makalic25511.54