Title
Estimation of KL Divergence: Optimal Minimax Rate.
Abstract
The problem of estimating the Kullback-Leibler divergence D(P∥Q) between two unknown distributions P and Q is studied, under the assumption that the alphabet size k of the distributions can scale to infinity. The estimation is based on m independent samples drawn from P and n independent samples drawn from Q. It is first shown that there does not exist any consistent estimator that guarantees asym...
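For illustration of the problem setup only, here is a minimal sketch (Python with NumPy) of the naive plug-in estimator of D(P∥Q) from m samples of P and n samples of Q over an alphabet of size k. The function name plugin_kl_estimate and all parameter values are hypothetical; this is not the minimax-optimal estimator constructed in the paper, and the plug-in baseline is known to be heavily biased (and possibly infinite) when k is comparable to the sample sizes, which is the large-alphabet regime the abstract refers to.

```python
import numpy as np

def plugin_kl_estimate(samples_p, samples_q, k):
    """Naive empirical plug-in estimate of D(P||Q) over alphabet {0, ..., k-1}.

    Illustration of the problem setup only; not the estimator from the paper.
    """
    p_hat = np.bincount(samples_p, minlength=k) / len(samples_p)
    q_hat = np.bincount(samples_q, minlength=k) / len(samples_q)
    support = p_hat > 0
    # The plug-in estimate is infinite if a symbol observed under P
    # was never observed under Q.
    if np.any(q_hat[support] == 0):
        return np.inf
    return float(np.sum(p_hat[support] * np.log(p_hat[support] / q_hat[support])))

# Illustrative usage with synthetic distributions (k, m, n chosen arbitrarily).
rng = np.random.default_rng(0)
k, m, n = 1000, 5000, 5000
p = rng.dirichlet(np.ones(k))
q = rng.dirichlet(np.ones(k))
x = rng.choice(k, size=m, p=p)
y = rng.choice(k, size=n, p=q)
print(plugin_kl_estimate(x, y, k))
```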
Year
2018
DOI
10.1109/TIT.2018.2805844
Venue
IEEE Transactions on Information Theory
Keywords
Estimation, Entropy, Upper bound, Information theory, Electronic mail, Histograms, Complexity theory
DocType
Journal
Volume
64
Issue
4
ISSN
0018-9448
Citations
6
PageRank
0.50
References
13
Authors
4
Name                      Order  Citations  PageRank
Yuheng Bu                 1      10         6.06
Shaofeng Zou              2      53         14.20
Yingbin Liang             3      1646       147.64
Venugopal V. Veeravalli   4      1566       150.28