Title
Model selection by sequentially normalized least squares
Abstract
Model selection by means of the predictive least squares (PLS) principle has been thoroughly studied in the context of regression model selection and autoregressive (AR) model order estimation. We introduce a new criterion based on sequentially minimized squared deviations, which are smaller than both the usual least squares residuals and the squared prediction errors used in PLS. We also prove that our criterion has a probabilistic interpretation as a model that is asymptotically optimal within the given class of distributions, reaching the lower bound on the logarithmic prediction errors given by the so-called stochastic complexity and approximated by BIC. This holds when the regressor (design) matrix is non-random or determined by the observed data, as in AR models. The advantages of the criterion include that it can be evaluated efficiently and exactly, without asymptotic approximations, and, importantly, that it has no adjustable hyper-parameters, which makes it applicable to both small and large amounts of data.
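The central quantity named in the abstract is the sequentially minimized squared deviation: the squared error at each observation under the least squares estimate refitted on the data seen so far, including that observation. This contrasts with the PLS prediction error, which uses the estimate from the previous observations only, and with the ordinary least squares residual, which uses the estimate from the full sample. Below is a minimal numerical sketch of this comparison, not the authors' code: the design matrix, sample size, noise level, and seed are made-up illustration values, and it computes only the three kinds of deviations, not the full SNLS criterion.

```python
# Minimal sketch (not the authors' code) of the three kinds of squared deviations
# contrasted in the abstract for a fixed linear regression model; the design, sample
# size, noise level, and seed below are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 3                                   # n observations, m regressors
X = rng.normal(size=(n, m))                     # non-random design, as in the regression case
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.7, size=n)

def ls_fit(X_part, y_part):
    """Ordinary least squares estimate on the given rows."""
    return np.linalg.lstsq(X_part, y_part, rcond=None)[0]

# Ordinary least squares: residuals under the estimate from all n observations.
beta_full = ls_fit(X, y)
ols_sq = (y - X @ beta_full) ** 2

pls_sq = np.full(n, np.nan)                     # squared prediction errors (PLS)
snls_sq = np.full(n, np.nan)                    # sequentially minimized squared deviations
for t in range(m, n):                           # start once the LS estimate is well defined
    beta_past = ls_fit(X[:t], y[:t])            # estimate from the first t observations only
    beta_curr = ls_fit(X[:t + 1], y[:t + 1])    # estimate that also includes observation t
    pls_sq[t] = (y[t] - X[t] @ beta_past) ** 2
    snls_sq[t] = (y[t] - X[t] @ beta_curr) ** 2

keep = slice(m, n)
print("mean squared OLS residual   :", ols_sq[keep].mean())
print("mean squared PLS pred. error:", pls_sq[keep].mean())
print("mean squared sequential dev.:", snls_sq[keep].mean())
# Typically the sequentially minimized deviations are the smallest of the three and the
# PLS prediction errors the largest, consistent with the comparison made in the abstract.
```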
Year
2010
DOI
10.1016/j.jmva.2009.12.009
Venue
Journal of Multivariate Analysis (J. Multivariate Analysis)
Keywords
model order estimation, time series, observed data, order estimation, linear regression, logarithmic prediction error, 62l12, regression model selection, predictive least squares, 62b10, model selection, prediction error, 62j05, adjustable hyper-parameters, new criterion, asymptotic approximation, 94a15, ar model, 62m10, time series model, lower bound, regression model, least square
Field
Least squares, Econometrics, Autoregressive model, Model selection, Statistical model, Residual sum of squares, Statistics, Asymptotically optimal algorithm, Mathematics, Linear regression, Autocorrelation
DocType
Journal
Volume
101
Issue
4
ISSN
Citations
7
PageRank
0.64
References
6
Authors
3
Name             Order   Citations   PageRank
Jorma Rissanen   1       16657       98.14
Teemu Roos       2       436         61.32
Petri Myllymäki  3       802         88.90