Title
Improved finite-sample estimate of a nonparametric f-divergence.
Abstract
Information divergence functions allow us to measure distances between probability density functions. We focus on the case where we only have data from the two distributions and have no knowledge of the underlying models from which the data is sampled. In this scenario, we consider an f-divergence for which there exists an asymptotically consistent, nonparametric estimator based on minimum spanning trees, the D_p divergence. Nonparametric estimators are known to have slow convergence rates in higher dimensions (d > 4), resulting in a large bias for small datasets. Based on experimental validation, we conjecture that the original estimator follows a power law convergence model and introduce a new estimator based on a bootstrap sampling scheme that results in a reduced bias. Experiments on real and artificial data show that the new estimator results in improved estimates of the D_p divergence when compared against the original estimator.
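The abstract describes two computational ingredients: the minimum-spanning-tree (Friedman-Rafsky) estimator of the D_p divergence, and a bias-reduction scheme that fits a power-law convergence model to estimates computed on bootstrap subsamples. The sketch below illustrates both ideas under stated assumptions; it is not the authors' implementation. Euclidean distances and subsampling without replacement are assumed, and the names dp_divergence and dp_extrapolated, the subsample fractions, and the fitting details are illustrative.

```python
# Sketch only: assumes the power-law model D(N) ~ D_inf + c * N**(-alpha)
# conjectured in the abstract; all names and parameters are illustrative.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.optimize import curve_fit

def dp_divergence(X, Y):
    """Friedman-Rafsky / MST estimate of the D_p divergence between samples."""
    m, n = len(X), len(Y)
    Z = np.vstack([X, Y])                        # pooled sample
    labels = np.concatenate([np.zeros(m), np.ones(n)])
    mst = minimum_spanning_tree(cdist(Z, Z)).tocoo()  # MST on pairwise distances
    # Count MST edges that connect a point from X to a point from Y.
    cross = np.sum(labels[mst.row] != labels[mst.col])
    # Tends to 0 when the distributions match and to 1 when their supports are
    # disjoint; can dip slightly below 0 in finite samples.
    return 1.0 - cross * (m + n) / (2.0 * m * n)

def dp_extrapolated(X, Y, fractions=(0.4, 0.55, 0.7, 0.85, 1.0), reps=20, seed=0):
    """Reduced-bias estimate: fit D(N) = D_inf + c * N**(-alpha) to subsampled
    estimates and report the extrapolated value D_inf."""
    rng = np.random.default_rng(seed)
    sizes, means = [], []
    for f in fractions:
        m, n = int(f * len(X)), int(f * len(Y))
        est = [dp_divergence(X[rng.choice(len(X), m, replace=False)],
                             Y[rng.choice(len(Y), n, replace=False)])
               for _ in range(reps)]
        sizes.append(m + n)
        means.append(np.mean(est))
    model = lambda N, d_inf, c, alpha: d_inf + c * N ** (-alpha)
    popt, _ = curve_fit(model, np.array(sizes), np.array(means),
                        p0=[means[-1], 1.0, 0.5], maxfev=10000)
    return popt[0]  # extrapolated divergence at infinite sample size

# Example: two Gaussians in d = 5, where finite-sample bias is pronounced.
rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(300, 5))
Y = rng.normal(0.5, 1.0, size=(300, 5))
print(dp_divergence(X, Y), dp_extrapolated(X, Y))
```

The extrapolation step is one plausible reading of the abstract's bootstrap sampling scheme: estimate the divergence at several reduced sample sizes, fit the conjectured power law, and take the fitted limit as the bias-reduced estimate.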
Year
2017
Venue
ACSSC
Field
Convergence (routing), Applied mathematics, Mathematical optimization, Divergence, Computer science, Bootstrapping (statistics), Nonparametric statistics, Probability density function, f-divergence, Kullback–Leibler divergence, Estimator
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name           Order  Citations  PageRank
Prad Kadambi   1      0          0.34
Alan Wisler    2      6          2.53
Visar Berisha  3      76         22.38