Title
Nonparametric Direct Entropy Difference Estimation
Abstract
We propose a nonparametric method to directly estimate the difference of the Shannon entropies of two continuous random variables using a finite number of samples. The method is based on a k-nearest-neighbor approach. We provide a finite-sample analysis of the bias and variance of the proposed estimator. Numerical experiments show that our method performs better than estimating the entropies of the two random variables separately and taking the difference. As an application of our estimator, we show that it can be used to significantly improve the performance of mutual information estimation via the k-nearest-neighbor method for strongly dependent variables.
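The abstract does not give the estimator in closed form, but the baseline it compares against (estimating each entropy separately with the classical Kozachenko-Leonenko k-NN estimator and subtracting) can be sketched. The sketch below is illustrative only, not the paper's direct estimator; the function names, the choice k=3, and the Gaussian test case are our assumptions.

```python
import numpy as np
from math import lgamma
from scipy.spatial import cKDTree
from scipy.special import digamma

def kl_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (nats).

    samples: array of shape (n, d); k: neighbor order.
    """
    n, d = samples.shape
    tree = cKDTree(samples)
    # Distance to the k-th nearest neighbor; column 0 is the point itself.
    eps = tree.query(samples, k=k + 1)[0][:, k]
    # Log-volume of the unit Euclidean ball in d dimensions.
    log_c_d = (d / 2.0) * np.log(np.pi) - lgamma(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

def entropy_difference_baseline(x, y, k=3):
    # Baseline only: the paper's direct difference estimator, which the
    # abstract reports as more accurate, is not reproduced here.
    return kl_entropy(x, k) - kl_entropy(y, k)

# Example: h(X) - h(Y) for Gaussians with std 2 and 1; true value is
# 0.5 * log(4) = log(2) ~= 0.693 nats.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, size=(5000, 1))
y = rng.normal(0.0, 1.0, size=(5000, 1))
print(entropy_difference_baseline(x, y))
```

In this baseline the bias terms of the two separate estimates need not cancel, which is what leaves room for a direct difference estimator to improve, particularly in the mutual information application the abstract mentions, where entropies of strongly dependent variables are differenced.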
Year
2018
DOI
10.1109/ITW.2018.8613521
Venue
2018 IEEE Information Theory Workshop (ITW)
Keywords
entropy, mutual information, k-nearest neighbor
Field
k-nearest neighbors algorithm, Applied mathematics, Discrete mathematics, Random variable, Finite set, Computer science, Nonparametric statistics, Mutual information, Variables, Entropy (information theory), Estimator
DocType
Conference
ISSN
2475-420X
ISBN
978-1-5386-3600-8
Citations
0
PageRank
0.34
References
0
Authors
2
Name         Order  Citations  PageRank
Puning Zhao  1      0          2.70
Lifeng Lai   2      2289       167.78