Title
Sequential normalized maximum likelihood in log-loss prediction
Abstract
The paper considers sequential prediction of individual sequences with log loss using an exponential family of distributions. We first show that the commonly used maximum likelihood strategy is suboptimal and requires an additional assumption about boundedness of the data sequence. We then show that both problems can be addressed by adding the currently predicted outcome to the calculation of the maximum likelihood, followed by normalization of the distribution. The strategy obtained in this way is known in the literature as the sequential normalized maximum likelihood (SNML) strategy. We show that for general exponential families, the regret is bounded by the familiar (k/2) log n and is thus optimal up to O(1). We also introduce an approximation to SNML, flattened maximum likelihood, which is much easier to compute than SNML itself while retaining the optimal regret under some additional assumptions. We finally discuss the relationship to the Bayes strategy with Jeffreys' prior.
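As a rough illustration of the prediction rule described in the abstract (not taken from the paper itself, which treats general exponential families), the sketch below works out SNML for the one-dimensional Bernoulli family: each candidate value of the next outcome is appended to the observed data, the likelihood of the extended sequence is evaluated at its own maximum likelihood estimate, and the resulting values are normalized. The function name snml_bernoulli and the choice of family are assumptions made here for concreteness.

def snml_bernoulli(past):
    """Hedged sketch of SNML prediction for a Bernoulli model (illustrative only).

    For each candidate outcome x in {0, 1}, append x to the past data,
    evaluate the likelihood of the extended sequence at its own maximum
    likelihood estimate, and normalize the two values so they sum to one.
    """
    t = len(past) + 1          # sequence length including the candidate outcome
    s = sum(past)              # number of ones observed so far
    scores = []
    for x in (0, 1):
        theta = (s + x) / t    # ML estimate with the candidate outcome included
        scores.append(theta ** (s + x) * (1 - theta) ** (t - s - x))
    return scores[1] / sum(scores)   # predicted probability that the next outcome is 1

# Example: after observing 1, 1, 1, 0 the SNML probability of a one is about 0.70,
# slightly less extreme than the plain ML estimate of 0.75.
print(snml_bernoulli([1, 1, 1, 0]))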
Year
2012
DOI
10.1109/ITW.2012.6404734
Venue
Information Theory Workshop
Keywords
Bayes methods, exponential distribution, maximum likelihood sequence estimation, Bayes strategy, Jeffreys prior, SNML, data sequence, exponential distributions family, flattened maximum likelihood, log-loss prediction, sequential normalized maximum likelihood
Field
Applied mathematics, Discrete mathematics, Likelihood function, Expectation–maximization algorithm, Bayes factor, Maximum a posteriori estimation, Estimation theory, Restricted maximum likelihood, Statistics, Maximum likelihood sequence estimation, Mathematics, Likelihood principle
DocType
Conference
ISBN
978-1-4673-0222-7
Citations
1
PageRank
0.34
References
5
Authors
2
Name, Order, Citations, PageRank
Wojciech Kotlowski, 1, 158, 16.32
Peter D. Grünwald, 2, 73, 10.86