Abstract |
---|
The problem of distortionless encoding when the parameters of the probabilistic model of a source are unknown is considered from a statistical decision theory point of view. A class of predictive and nonpredictive codes that are optimal within this framework is proposed. Specifically, it is shown that the codeword length of the proposed predictive code coincides with that of the proposed nonpredictive code for any source sequence. A bound on the redundancy of universal coding is given in terms of the supremum of the Bayes risk. If this supremum exists, then there exists a minimax code in the proposed class whose mean code length approaches it, and the minimax code is given by the Bayes solution relative to the prior distribution on the source parameters that maximizes the Bayes risk. |
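The abstract's central claim, that the predictive codeword length coincides with the nonpredictive (block mixture) codeword length for every source sequence, can be illustrated on a standard toy case: a Bernoulli source with a uniform prior, whose predictive distribution is the Laplace estimator. This is a generic sketch of the chain-rule identity, not the paper's own construction; the function names and the example sequence are illustrative.

```python
from math import comb, log2

def predictive_len(x):
    """Predictive code length: sum of -log2 P(x_t | x^(t-1)), where the
    predictive probability is the Laplace rule (ones+1)/(t+2), i.e. the
    Bayes predictive distribution under a uniform prior on theta."""
    length, ones = 0.0, 0
    for t, bit in enumerate(x):
        p1 = (ones + 1) / (t + 2)
        length += -log2(p1 if bit else 1.0 - p1)
        ones += bit
    return length

def nonpredictive_len(x):
    """Nonpredictive code length: -log2 of the block mixture probability
    P(x^n) = integral of theta^k (1-theta)^(n-k) dtheta = 1/((n+1)*C(n,k))."""
    n, k = len(x), sum(x)
    return -log2(1.0 / ((n + 1) * comb(n, k)))

x = [1, 0, 1, 1, 0, 1, 1, 1]
# By the chain rule the two lengths agree (up to floating-point error)
# for ANY binary sequence, mirroring the coincidence stated in the abstract.
print(predictive_len(x), nonpredictive_len(x))
```

The equality holds because multiplying the Laplace predictive probabilities over the sequence telescopes, by the chain rule, into exactly the mixture probability of the whole block.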
Year | DOI | Venue |
---|---|---|
1991 | 10.1109/18.133247 | IEEE Transactions on Information Theory
Keywords | Field | DocType
---|---|---|
Bayes methods,decision theory,encoding,Bayes decision theory,Bayes risk,codeword length,distortionless codes,nonpredictive codes,predictive code,probabilistic model,redundancy,source sequence,statistical decision theory,universal coding | Minimax,Computer science,Decision theory,Code word,Artificial intelligence,Bayes' theorem,Information theory,Discrete mathematics,Pattern recognition,Algorithm,Statistical model,Linear code,Prior probability | Journal
Volume | Issue | ISSN
---|---|---|
37 | 5 | 0018-9448
Citations | PageRank | References
---|---|---|
18 | 1.83 | 7
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Toshiyasu Matsushima | 1 | 97 | 32.76 |
H. Inazumi | 2 | 92 | 13.83 |
Shigeichi Hirasawa | 3 | 322 | 150.91 |