Title: Mutual Information Bounds via Adjacency Events
Abstract: The mutual information between two jointly distributed random variables $X$ and $Y$ is a functional of the joint distribution $P_{XY}$, which is sometimes difficult to handle or estimate. A coarser description of the statistical behavior of $(X,Y)$ is given by the marginal distributions $P_X, P_Y$ and the adjacency relation induced by the joint distribution, where $x$ and $y$ are adjacent if $P(x,y)>0$. We derive a lower bound on the mutual information in terms of these entities. The bound is obtained by viewing the channel from $X$ to $Y$ as a probability distribution on a set of possible actions, where an action determines the output for any possible input and is drawn independently of the input. We also provide an alternative proof based on convex optimization that yields a generally tighter bound. Finally, we derive an upper bound on the mutual information in terms of adjacency events between the action and the pair $(X,Y)$, where in this case an action $a$ and a pair $(x,y)$ are adjacent if $y=a(x)$. As an example, we apply our bounds to the binary deletion channel and show that, for the special case of an independent and identically distributed (i.i.d.) input distribution and a range of deletion probabilities, our lower and upper bounds both outperform the best known bounds for the mutual information.
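To make the abstract's central objects concrete, the following is a minimal Python sketch, using a hypothetical toy joint pmf that is not taken from the paper. It computes the exact mutual information $I(X;Y)$, extracts the adjacency relation ($x$ and $y$ adjacent iff $P(x,y)>0$), and illustrates the "actions" view by writing the channel $P_{Y|X}$ as a mixture of deterministic maps $a\colon \mathcal{X}\to\mathcal{Y}$ drawn independently of the input. The product-form weights $q(a)=\prod_x W(a(x)\mid x)$ used here are one simple valid choice of action distribution, not the (generally non-unique) distribution optimized in the paper's bounds.

```python
import itertools
import numpy as np

def mutual_information(p_xy):
    """Exact I(X;Y) in bits from a joint pmf given as a 2-D array."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])))

# Hypothetical toy joint pmf on X in {0,1}, Y in {0,1,2} (illustrative only).
p_xy = np.array([[0.30, 0.20, 0.00],
                 [0.00, 0.25, 0.25]])

# Adjacency relation induced by the joint pmf: x and y adjacent iff P(x,y) > 0.
adjacent = p_xy > 0

# Channel W = P_{Y|X}, and its representation via a random "action": a
# deterministic map a: X -> Y drawn independently of X. The product weights
# q(a) = prod_x W(a(x)|x) form one valid mixture; every action respects the
# adjacency relation, and the mixture reproduces the channel exactly.
p_x = p_xy.sum(axis=1)
W = p_xy / p_x[:, None]
n_x, n_y = W.shape
actions = [a for a in itertools.product(range(n_y), repeat=n_x)
           if all(adjacent[x, a[x]] for x in range(n_x))]
q = {a: np.prod([W[x, a[x]] for x in range(n_x)]) for a in actions}

# Reconstruct the channel from the action mixture as a consistency check.
W_rebuilt = np.zeros_like(W)
for a, weight in q.items():
    for x in range(n_x):
        W_rebuilt[x, a[x]] += weight

print("I(X;Y) =", round(mutual_information(p_xy), 4), "bits")
print("adjacency:\n", adjacent.astype(int))
assert np.allclose(W, W_rebuilt)  # the action mixture reproduces P_{Y|X}
```

The point of the adjacency-respecting decomposition is that the paper's bounds depend only on the marginals and the adjacency relation, so any channel sharing the same support structure admits such an action representation and yields the same bound.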
Year: 2016
DOI: 10.1109/TIT.2016.2609390
Venue: IEEE Trans. Information Theory
Keywords: Mutual information, Uncertainty, Entropy, Upper bound, Probability distribution, Convex functions, Random variables
Field: Discrete mathematics, Combinatorics, Joint probability distribution, Upper and lower bounds, Probability distribution, Convex function, Mutual information, Deletion channel, Conditional mutual information, Marginal distribution, Mathematics
DocType: Journal
Volume: 62
Issue: 11
ISSN: 0018-9448
Citations: 1
PageRank: 0.36
References: 5
Authors: 3

Name             Order  Citations  PageRank
Yanjun Han       1      98         13.49
Or Ordentlich    2      121        18.37
Ofer Shayevitz   3      158        26.41