Title: The index entropy of a mismatched codebook
Abstract: Entropy coding is a well-known technique for reducing the rate of a quantizer. It plays a particularly important role in universal quantization, where the quantizer codebook is not matched to the source statistics. We investigate the gain due to entropy coding by considering the entropy of the index of the first codeword, in a mismatched random codebook, that D-matches the source word. We show that the index entropy is strictly lower than the "uncoded" rate of the code, provided that the entropy is conditioned on the codebook. The number of bits saved by conditional entropy coding is equal to the divergence between the "favorite type" (the limiting empirical distribution of the first D-matching codeword) and the codebook-generating distribution. Specific examples are provided.
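To make the setup concrete, below is a minimal simulation sketch, not taken from the paper: it assumes a binary alphabet, per-letter Hamming distortion, a Bernoulli(p) source, and a Bernoulli(q) codebook-generating distribution (all illustrative choices, as are the parameter values and the function name), scans i.i.d. codewords for the first one that D-matches the source word, and records its index. The normalized log of that index estimates the "uncoded" rate, which, per the abstract, conditional entropy coding of the index improves by D(P* || Q) bits per symbol, P* being the favorite type.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 16           # block length (assumed for the sketch)
D = 0.2          # per-letter Hamming distortion threshold
p, q = 0.4, 0.2  # source P = Bern(p); mismatched codebook Q = Bern(q)

def first_dmatch_index(x, max_tries=1_000_000):
    """Return the 1-based index of the first i.i.d. Q-codeword whose
    per-letter Hamming distortion from the source word x is at most D."""
    for i in range(1, max_tries + 1):
        c = rng.random(n) < q              # next codeword ~ Q^n
        if np.mean(c != x) <= D:           # D-match test
            return i
    raise RuntimeError("no D-match found within max_tries")

logs = []
for _ in range(300):
    x = rng.random(n) < p                  # source word ~ P^n
    logs.append(np.log2(first_dmatch_index(x)))

# (1/n) log2(index) concentrates on the "uncoded" rate of the code; the
# paper's result is that entropy coding the index *conditioned on the
# codebook* spends D(P* || Q) fewer bits per symbol, where P* is the
# limiting empirical distribution of the first D-matching codeword.
print(f"empirical uncoded rate: {np.mean(logs) / n:.3f} bits/symbol")
```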
Year: 2002
DOI: 10.1109/18.979328
Venue: IEEE Transactions on Information Theory
Keywords: source word, empirical distribution, source statistics, D-matching codeword, mismatched random codebook, codebook-generating distribution, index entropy, conditional entropy coding, quantizer codebook, mismatched codebook, entropy coding, approximate string matching, lattices, quantization, indexation, gain, source coding, information theory, source code, conditional entropy, statistics
Field: Cross entropy, Discrete mathematics, Entropy encoding, Rényi entropy, Shannon's source coding theorem, Principle of maximum entropy, Conditional entropy, Arithmetic coding, Mathematics, Maximum entropy probability distribution
DocType: Journal
Volume: 48
Issue: 2
ISSN: 0018-9448
Citations: 3
PageRank: 0.54
References: 10
Authors: 1
Name: Zamir, R.
Order: 1
Citations: 1496
PageRank: 141.87