Title
A finite-sample, distribution-free, probabilistic lower bound on mutual information
Abstract
For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information from empirical observations of the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution free. A quadratic-time algorithm is described for computing the bound and its corresponding class-conditional distribution functions. We compare our approach to existing techniques and show that our bound is superior to a method inspired by Fano's inequality in which the continuous random variable is discretized.
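The abstract's key ingredient, the Dvoretzky-Kiefer-Wolfowitz (DKW) inequality, bounds how far an empirical CDF can stray from the true CDF: with probability at least 1 - alpha, sup_x |F_n(x) - F(x)| <= sqrt(ln(2/alpha) / (2n)). The sketch below (not the paper's algorithm, just the confidence band it builds on; the function name and interface are illustrative) computes such a distribution-free band from samples:

```python
import numpy as np

def dkw_band(samples, alpha=0.05):
    """Distribution-free (1 - alpha) confidence band for the CDF via the
    Dvoretzky-Kiefer-Wolfowitz inequality. Illustrative sketch only; the
    paper combines bands like this into a lower bound on mutual information."""
    n = len(samples)
    # DKW half-width: P(sup |F_n - F| > eps) <= 2 exp(-2 n eps^2)
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))
    xs = np.sort(samples)
    ecdf = np.arange(1, n + 1) / n          # empirical CDF at the sorted points
    lower = np.clip(ecdf - eps, 0.0, 1.0)   # band clipped to valid CDF range
    upper = np.clip(ecdf + eps, 0.0, 1.0)
    return xs, lower, upper
```

For the two-class setting in the paper, one such band would be computed per class-conditional output distribution.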
Year
2011
DOI
10.1162/NECO_a_00144
Venue
Neural Computation
Keywords
mutual information, conditional distribution, random variable, communication channels, lower bound
Field
Sampling distribution, Chapman–Robbins bound, Random variable, Conditional probability distribution, Upper and lower bounds, Algorithm, Mutual information, Probabilistic logic, Time complexity, Mathematics
DocType
Journal
Volume
23
Issue
7
ISSN
0899-7667
Citations
1
PageRank
0.40
References
13
Authors
2
Nathan D. VanderKraats (Order 1, Citations 1, PageRank 0.40)
Arunava Banerjee (Order 2, Citations 313, PageRank 29.18)