Abstract
---
Estimating Kullback-Leibler divergence from independently and identically distributed (i.i.d.) samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator.
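The abstract describes an estimator built from k nearest neighbor distances. A minimal sketch of one such estimator is below, in the Wang-Kulkarni-Verdu form: for each sample from p, compare its k-th nearest neighbor distance within the p-samples against its k-th nearest neighbor distance within the q-samples. The function name, parameters, and use of `scipy.spatial.cKDTree` are illustrative choices, not taken from this paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of D(p || q) from i.i.d. samples x ~ p and y ~ q.

    x: array of shape (n, d), y: array of shape (m, d).
    """
    x = np.atleast_2d(x)
    y = np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    # Distance from each x_i to its k-th nearest neighbor among the other
    # x-samples; query with k+1 because x_i is its own nearest neighbor.
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # Distance from each x_i to its k-th nearest neighbor among the y-samples.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # Log-ratio of distances, dimension-scaled, plus a sample-size correction.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

# Example: N(0, 1) vs N(3, 1) in one dimension; the true divergence is 4.5.
rng = np.random.default_rng(0)
p_samples = rng.normal(0.0, 1.0, size=(2000, 1))
q_samples = rng.normal(3.0, 1.0, size=(2000, 1))
est = knn_kl_divergence(p_samples, q_samples, k=5)
```

Larger k reduces the variance of each distance ratio at the cost of added bias; the trade-off between these two terms is the subject of the paper's analysis.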
Year | DOI | Venue |
---|---|---|
2020 | 10.1109/ISIT44484.2020.9174033 | ISIT |
DocType | Citations | PageRank
---|---|---
Conference | 0 | 0.34
References | Authors
---|---
0 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Puning Zhao | 1 | 0 | 0.34 |
Lifeng Lai | 2 | 2289 | 167.78 |