Abstract
---
The problem of estimating the Kullback-Leibler divergence D(P∥Q) between two unknown distributions P and Q is studied, under the assumption that the alphabet size k of the distributions can scale to infinity. The estimation is based on m independent samples drawn from P and n independent samples drawn from Q. It is first shown that there does not exist any consistent estimator that guarantees asym...
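As context for the estimation problem described in the abstract, here is a minimal sketch of the naive plug-in estimator of D(P∥Q) from empirical histograms (not the estimator proposed in the paper), assuming samples are integer-coded over the alphabet {0, ..., k-1}; the function name `plugin_kl` is illustrative:

```python
import numpy as np

def plugin_kl(samples_p, samples_q, k):
    """Plug-in estimate of D(P||Q) from samples over alphabet {0, ..., k-1}."""
    # Empirical distributions: m samples from P, n samples from Q.
    p_hat = np.bincount(samples_p, minlength=k) / len(samples_p)
    q_hat = np.bincount(samples_q, minlength=k) / len(samples_q)
    mask = p_hat > 0  # terms with p_hat = 0 contribute zero
    if np.any(q_hat[mask] == 0):
        # The plug-in estimate is infinite when some symbol with
        # positive empirical P-mass was never observed under Q;
        # this is one reason consistency fails without extra assumptions.
        return float("inf")
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_hat[mask])))

# Identical empirical histograms give a divergence estimate of zero.
print(plugin_kl([0, 0, 1, 1], [0, 1, 0, 1], k=2))
```

For large alphabets this naive estimator is badly biased when m and n are comparable to k, which is precisely the regime the paper analyzes.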
Year | DOI | Venue
---|---|---
2018 | 10.1109/TIT.2018.2805844 | IEEE Transactions on Information Theory

Keywords | DocType | Volume
---|---|---
Estimation, Entropy, Upper bound, Information theory, Electronic mail, Histograms, Complexity theory | Journal | 64

Issue | ISSN | Citations
---|---|---
4 | 0018-9448 | 6

PageRank | References | Authors
---|---|---
0.50 | 13 | 4
Name | Order | Citations | PageRank
---|---|---|---
Yuheng Bu | 1 | 10 | 6.06
Shaofeng Zou | 2 | 53 | 14.20
Yingbin Liang | 3 | 1646 | 147.64
Venugopal V. Veeravalli | 4 | 1566 | 150.28