Title
A data-driven basis for direct estimation of functionals of distributions.
Abstract
A number of fundamental quantities in statistical signal processing and information theory can be expressed as integral functionals of two probability density functions. Such quantities are called density functionals as they map density functions onto the real line. For example, information divergence functions measure the dissimilarity between two probability density functions and are particularly useful in a number of applications. Typically, estimating these quantities requires complete knowledge of the underlying distributions followed by multi-dimensional integration. Existing methods make parametric assumptions about the data distribution or use non-parametric density estimation followed by high-dimensional integration. In this paper, we propose a new alternative. We introduce the concept of basis functions - functions of distributions whose values we can estimate given only samples from the underlying distributions, without requiring distribution fitting or direct integration. We derive a new data-driven complete basis that is similar to the deterministic Bernstein polynomial basis and develop two methods for performing basis expansions of functionals of two distributions. We also show that the new basis set allows us to approximate functionals of distributions as closely as desired. Finally, we evaluate the methodology by developing data-driven estimators for the Kullback-Leibler divergence and the Hellinger distance and by constructing tight data-driven bounds on the Bayes error rate.
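To make the notion of a density functional concrete, the standard integral forms of the quantities named in the abstract are sketched below. This is illustrative notation only, not taken from the paper: f and g denote two probability densities, and p0, p1, f0, f1 denote class priors and class-conditional densities in a two-class problem.

D_{\mathrm{KL}}(f \,\|\, g) = \int f(x) \log \frac{f(x)}{g(x)} \, dx  % Kullback-Leibler divergence
H^{2}(f, g) = \frac{1}{2} \int \left( \sqrt{f(x)} - \sqrt{g(x)} \right)^{2} dx  % squared Hellinger distance
\varepsilon^{*} = \int \min\{\, p_{0} f_{0}(x),\; p_{1} f_{1}(x) \,\} \, dx  % Bayes error rate (two classes)

Each maps a pair of densities to a real number, which is the sense in which the abstract calls them functionals of distributions.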
Year: 2017
Venue: arXiv: Information Theory
Field: Statistical physics, Data-driven, Computer science
DocType:
Volume: abs/1702.06516
Citations: 0
Journal:
PageRank: 0.34
References: 0
Authors: 4
Name                  Order  Citations  PageRank
Alan Wisler           1      6          2.53
Visar Berisha         2      76         22.38
Andreas S. Spanias    3      528        87.90
Alfred O. Hero III    4      1713       197.61