Title
Linear Complexity Entropy Regions
Abstract
A great many problems in network information theory, both fundamental and applied, involve determining a minimal set of inequalities linking the Shannon entropies of a certain collection of subsets of random variables. In principle, this minimal set of inequalities could be determined from the entropy region, whose coordinates are indexed by all subsets of the random variables, by projecting out the dimensions corresponding to the irrelevant subsets. As a general solution technique, however, this method is plagued both by the incompletely known nature of the entropy region and by the exponential complexity of its bounds. Even worse, for four or more random variables, it is known that the set of linear information inequalities necessary to completely describe the entropy region must be uncountably infinite. A natural question then arises: are there nontrivial collections of subsets for which the inequalities linking only those subsets are completely known and admit descriptions that are linear in the number of random variables? This paper answers this question in the affirmative. A decomposition expressing the collection of inequalities linking a larger collection of subsets in terms of those linking smaller collections of subsets is first proven. This decomposition is then used to exhibit systems of subsets for which it exhaustively determines the complete list of inequalities, a list whose size is linear in the number of random variables.
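As background for the objects named in the abstract, the following sketch records the standard definitions of the entropy region and of its projection onto a collection of subsets; the notation (Gamma*_n, h_A, pi_A) is the usual one from the information-inequality literature and is assumed here rather than quoted from the paper itself.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Entropy region: for jointly distributed random variables X_1,...,X_n,
% collect the joint entropies of all nonempty subsets A of [n] = {1,...,n}.
\[
  \Gamma^{*}_{n}
  = \Bigl\{ h \in \mathbb{R}^{2^{n}-1} :
      \exists\, X_1,\dots,X_n \ \text{with}\
      h_{A} = H(X_{A})\ \text{for all}\ \emptyset \neq A \subseteq [n] \Bigr\}.
\]
% The inequalities linking only the entropies of the sets in a collection
% \mathcal{A} of subsets of [n] are those describing the coordinate
% projection of the closure of the entropy region (the closure is taken,
% as is standard when studying linear information inequalities):
\[
  \pi_{\mathcal{A}}\bigl(\overline{\Gamma^{*}_{n}}\bigr)
  = \bigl\{ (h_{A})_{A \in \mathcal{A}} : h \in \overline{\Gamma^{*}_{n}} \bigr\}.
\]
\end{document}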
Year
2021
DOI
10.1109/ISIT45174.2021.9518030
Venue
2021 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT)
Keywords
information inequalities, complexity of entropy region
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
2
Name                 Order  Citations  PageRank
Yirui Liu            1      0          0.68
John MacLaren Walsh  2      107        17.90