Title
Informational Divergence Approximations to Product Distributions
Abstract
The minimum rate needed to accurately approximate a product distribution under an unnormalized informational divergence is shown to be a mutual information. This result subsumes Wyner's result on common information and Han and Verdú's result on resolvability. It also extends to the case where the source distribution is unknown but its entropy is known.
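The abstract's claim matches the standard channel-resolvability formulation; the sketch below uses the conventional notation for that setup (the symbols are assumptions for illustration, not taken from this record):

```latex
% Target product distribution: Q_Z^{\times n}, where Q_Z is induced by an
% input distribution P_X through a channel P_{Z|X}:
%   Q_Z(z) = \sum_x P_X(x)\, P_{Z|X}(z \mid x).
% A rate-R codebook \{x^n(1), \dots, x^n(2^{nR})\} with a uniformly chosen
% index m induces the output distribution
P_{\tilde{Z}^n}(z^n) = 2^{-nR} \sum_{m=1}^{2^{nR}} \prod_{i=1}^{n}
    P_{Z|X}\bigl(z_i \mid x_i(m)\bigr).
% The minimum rate at which the unnormalized divergence vanishes,
%   D\bigl(P_{\tilde{Z}^n} \,\big\|\, Q_Z^{\times n}\bigr) \to 0
%   \quad \text{as } n \to \infty,
% is characterized by the mutual information condition
R > I(X; Z).
```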
Year
2013
DOI
10.1109/CWIT.2013.6621596
Venue
CWIT
Field
Applied mathematics, Computer vision, Mathematical optimization, Product distribution, Divergence, Differential entropy, Mutual information, Artificial intelligence, Total correlation, Mathematics
DocType
Journal
Volume
abs/1302.0215
Citations
11
PageRank
0.86
References
3
Authors
2
Name | Order | Citations | PageRank
Jie Hou | 1 | 97 | 4.39
Gerhard Kramer | 2 | 4453 | 4.21