Title: Information Decomposition and Synergy
Abstract: Recently, a series of papers has addressed the problem of decomposing the information that two random variables carry about a third into shared information, unique information, and synergistic information. Several measures have been proposed, but no consensus has been reached yet. Here, we compare these proposals with an older approach that defines synergistic information via projections onto exponential families containing only interactions up to k-th order. We show that these measures are not compatible with a decomposition into unique, shared, and synergistic information if one requires all terms to be non-negative (local positivity). We illustrate the difference between the two measures for multivariate Gaussians.
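The decomposition discussed in the abstract can be illustrated with the classic XOR example, where each input alone carries no information about the output but the two inputs jointly determine it, so the joint mutual information is purely synergistic. A minimal sketch using only the standard library (function names are illustrative, not from the paper):

```python
from itertools import product
from math import log2
from collections import defaultdict

def entropy(dist):
    # Shannon entropy in bits of a distribution given as {outcome: prob}.
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    # Marginalize a joint distribution onto the variables at positions idx.
    m = defaultdict(float)
    for outcome, p in joint.items():
        m[tuple(outcome[i] for i in idx)] += p
    return m

def mutual_info(joint, a, b):
    # I(A;B) = H(A) + H(B) - H(A,B)
    return (entropy(marginal(joint, a)) + entropy(marginal(joint, b))
            - entropy(marginal(joint, a + b)))

# Joint distribution of (X, Y, Z): X, Y uniform bits, Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

print(mutual_info(joint, (0,), (2,)))    # I(X;Z)   = 0.0 bits
print(mutual_info(joint, (1,), (2,)))    # I(Y;Z)   = 0.0 bits
print(mutual_info(joint, (0, 1), (2,)))  # I(X,Y;Z) = 1.0 bit, all synergy
```

Since the individual mutual informations vanish, neither unique nor shared information can be positive under local positivity, and the full bit of I(X,Y;Z) must be attributed to synergy.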
Year: 2015
DOI: 10.3390/e17053501
Venue: ENTROPY
Keywords: Shannon information, mutual information, information decomposition, shared information, synergy
Field: Random variable, Computer science, Exponential family, Theoretical computer science, Multivariate mutual information, Variation of information, Artificial intelligence, Mutual information, Interaction information, Pointwise mutual information, Entropy (information theory), Machine learning
DocType: Journal
Volume: 17
Issue: 5
ISSN: 1099-4300
Citations: 21
PageRank: 0.96
References: 12
Authors: 3

Name                 Order   Citations   PageRank
Eckehard Olbrich     1       135         16.51
Nils Bertschinger    2       225         21.10
Johannes Rauh        3       152         16.63