Title
On Relationship between Mutual Information and Variation
Abstract
Investigation of the relationship between mutual information and variational distance, begun in Pinsker's paper [1], where an upper bound on the mutual information in terms of the variational distance was obtained, is continued here. We present a simple lower bound which is optimal or asymptotically optimal in some cases. A uniform upper bound on the mutual information in terms of the variational distance is also derived for random variables with a finite number of values. For such random variables, the asymptotic behaviour of the maximum of the mutual information is also investigated in the cases where the variational distance tends to zero or to its maximum value.
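For concreteness, the two quantities compared in the abstract can be illustrated for random variables with finite alphabets: the mutual information I(X;Y) and the variational distance between the joint distribution and the product of its marginals. The sketch below is not from the paper; it only computes both quantities for a small, made-up joint probability matrix P, so the paper's question can be read as how large one quantity can be for a fixed value of the other.

```python
import numpy as np

def mutual_information(P):
    """Mutual information I(X;Y) in nats for a joint pmf matrix P[x, y]."""
    px = P.sum(axis=1, keepdims=True)   # marginal distribution of X
    py = P.sum(axis=0, keepdims=True)   # marginal distribution of Y
    mask = P > 0                        # skip zero-probability cells
    return float(np.sum(P[mask] * np.log(P[mask] / (px @ py)[mask])))

def variational_distance(P):
    """Variational distance V(X;Y) = sum_{x,y} |P(x,y) - P(x)P(y)|."""
    px = P.sum(axis=1, keepdims=True)
    py = P.sum(axis=0, keepdims=True)
    return float(np.abs(P - px @ py).sum())

# Hypothetical joint distribution on a 2x2 alphabet (illustration only).
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(mutual_information(P))    # about 0.193 nats
print(variational_distance(P))  # 0.6
```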
Year
2007
DOI
10.1109/ISIT.2007.4557078
Venue
2007 IEEE International Symposium on Information Theory
Keywords
mutual information, variational distance, random variables, asymptotic behaviour, entropy
Field
Combinatorics, Quantum mutual information, Upper and lower bounds, Distance correlation, Variation of information, Mutual information, Conditional entropy, Total correlation, Conditional mutual information, Mathematics
DocType
Conference
ISSN
2157-8095
ISBN
978-1-4244-1397-3
Citations
0
PageRank
0.34
References
1
Authors
1
Name
Vyacheslav V. Prelov
Order
1
Citations
145
PageRank
29.59