Title
Scaling of model approximation errors and expected entropy distances
Abstract
We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by the constant 1 − γ, where γ ≈ 0.5772 is the Euler–Mascheroni constant. For the models that we consider, this bound is approached as the cardinality of the sample space tends to infinity, provided the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters, the expected divergences behave in a similar way. These results serve as a reference for ranking the approximation capabilities of other statistical models.
Year
2014
DOI
10.14736/kyb-2014-2-0234
Venue
KYBERNETIKA
Keywords
exponential families, KL divergence, MLE, Dirichlet prior
Field
Mathematical optimization, Uniform distribution (continuous), Expected value, Statistical model, Dirichlet distribution, Prior probability, Scaling, State space, Mathematics, Bounded function
DocType
Journal
Volume
50
Issue
SP2
ISSN
0023-5954
Citations
4
PageRank
0.42
References
3
Authors
2
Name | Order | Citations | PageRank
Guido Montúfar | 1 | 223 | 31.42
Johannes Rauh | 2 | 152 | 16.63