Title
Analysis of K Nearest Neighbor KL Divergence Estimation for Continuous Distributions
Abstract
Estimating Kullback-Leibler divergence from independent and identically distributed (i.i.d.) samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator.
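The estimator referred to in the abstract is presumably the standard k-nearest-neighbor KL divergence estimator built from nearest-neighbor distances in the two samples. The sketch below, assuming the Wang-Kulkarni-Verdu form D-hat = (d/n) * sum_i log(nu_k(i)/rho_k(i)) + log(m/(n-1)), shows how such an estimate is typically computed; the function name knn_kl_divergence and the use of scipy.spatial.cKDTree are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of D(P || Q) from samples x ~ P and y ~ Q.

    x: (n, d) array of i.i.d. samples from P
    y: (m, d) array of i.i.d. samples from Q
    k: number of nearest neighbors used for the distance statistics
    """
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]

    # rho_k(i): distance from x_i to its k-th nearest neighbor among the
    # other x samples (query k+1 neighbors to skip the point itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu_k(i): distance from x_i to its k-th nearest neighbor among the y samples.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    # Wang-Kulkarni-Verdu form: (d/n) * sum log(nu/rho) + log(m/(n-1))
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

As a quick sanity check, calling this with x and y drawn from the same distribution should return a value near zero, while increasing the mean separation between the two samples should increase the estimate.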
Year
2020
DOI
10.1109/ISIT44484.2020.9174033
Venue
ISIT
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
2
Name           Order  Citations  PageRank
Puning Zhao    1      0          0.34
Lifeng Lai     2      2289       167.78