Abstract |
---|
We review the principles of Minimum Description Length and Stochastic Complexity as used in data compression and statistical modeling. Stochastic complexity is formulated as the solution to optimum universal coding problems extending Shannon's basic source coding theorem. The normalized maximized likelihood, mixture, and predictive codings are each shown to achieve the stochastic complexity to within asymptotically vanishing terms. We assess the performance of the minimum description length criterion both from the vantage point of quality of data compression and accuracy of statistical inference. Context tree modeling, density estimation, and model selection in Gaussian linear regression serve as examples. |
Year | DOI | Venue |
---|---|---|
1998 | 10.1109/18.720554 | IEEE Transactions on Information Theory |
Keywords | Field | DocType |
---|---|---|
computational complexity,data compression,information theory,maximum likelihood estimation,modelling,prediction theory,review,source coding,statistical analysis,stochastic processes,Gaussian linear regression,Shannon coding,context tree modeling,density estimation,minimum description length principle,mixture coding,model selection,normalized maximized likelihood coding,optimum universal coding problem,predictive coding,source coding theorem,statistical inference,statistical modeling,stochastic complexity | Statistical inference,Artificial intelligence,Information theory,Discrete mathematics,Pattern recognition,Shannon coding,Minimum description length,Algorithm,Model selection,Shannon's source coding theorem,Statistical model,Data compression,Mathematics | Journal |
Volume | Issue | ISSN |
---|---|---|
44 | 6 | 0018-9448 |
Citations | PageRank | References |
---|---|---|
400 | 31.01 | 18 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Andrew R. Barron | 1 | 616 | 125.60 |
Jorma Rissanen | 2 | 1665 | 798.14 |
Bin Yu | 3 | 1984 | 241.03 |