Title
Mutual information of several random variables and its estimation via variation
Abstract
We obtain upper and lower bounds for the maximum of the mutual information of several random variables via the variational distance between the joint distribution of these random variables and the product of its marginal distributions. In this connection, some properties of the variational distance between probability distributions of this type are derived. We show that in some special cases the estimates of the maximum of mutual information obtained here are optimal or asymptotically optimal. Some results of this paper generalize the corresponding results of [1–3] to the multivariate case.
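The quantities in the abstract can be computed directly for small discrete distributions. The following sketch (the function name and the toy distribution are illustrative, not taken from the paper) evaluates the multi-information of several random variables together with the variational distance between their joint distribution and the product of its marginals:

```python
import itertools
import math

def mi_and_variation(joint):
    """For a joint distribution over n discrete variables, return (I, V):
    I is the multi-information I(X1; ...; Xn) = D(P || P1 x ... x Pn) in bits,
    V is the variational (L1) distance between the joint distribution
    and the product of its marginal distributions.

    `joint` maps tuples (x1, ..., xn) to probabilities summing to 1.
    """
    n = len(next(iter(joint)))
    # Accumulate the n marginal distributions.
    marg = [{} for _ in range(n)]
    for x, p in joint.items():
        for i, xi in enumerate(x):
            marg[i][xi] = marg[i].get(xi, 0.0) + p
    I = V = 0.0
    # Sum over the full product of marginal supports, so that points
    # outside the support of the joint distribution still contribute to V.
    for x in itertools.product(*(m.keys() for m in marg)):
        p = joint.get(x, 0.0)
        q = math.prod(marg[i][xi] for i, xi in enumerate(x))
        if p > 0:
            I += p * math.log2(p / q)
        V += abs(p - q)
    return I, V

# Three fully dependent fair bits: I = 2 bits, V = 1.5.
I, V = mi_and_variation({(0, 0, 0): 0.5, (1, 1, 1): 0.5})
```

Since the multi-information is a relative entropy, the classical Pinsker inequality gives a lower bound of the kind the paper refines, I ≥ V²/(2 ln 2) bits; in the example above, 1.62 ≤ 2 as expected.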
Year
2009
DOI
10.1134/S0032946009040012
Venue
Problems of Information Transmission
Keywords
corresponding result, multivariate case, marginal distribution, random variable, mutual information, joint distribution, asymptotically optimal, variational distance, lower bound, probability distribution
Field
Convergence of random variables, Combinatorics, Algebra of random variables, Joint probability distribution, Multivariate random variable, Mutual information, Statistical distance, Sum of normally distributed random variables, Marginal distribution, Mathematics
DocType
Journal
Volume
45
Issue
4
ISSN
0032-9460
Citations
3
PageRank
0.48
References
6
Authors
1
Name
Vyacheslav V. Prelov
Order
1
Citations
145
PageRank
29.59