Title
Sifting Common Information from Many Variables
Abstract
Measuring the relationship between any two variables is a rich and active area of research at the core of the scientific enterprise. In contrast, characterizing the common information among a group of observed variables has remained a speculative undertaking producing no practical methods for high-dimensional data. A promising solution would be a multivariate generalization of the famous Wyner common information, but this approach relies on solving an apparently intractable optimization problem. We formulate an incremental version of this problem called the information sieve that not only admits a simple fixed-point solution, but also empirically exhibits an exponential rate of convergence. We use this scalable method to demonstrate that common information is a useful concept for machine learning. The sieve outperforms standard methods on dimensionality reduction tasks, solves a blind source separation problem involving Gaussian sources that cannot be solved with ICA, and accurately recovers structure in brain imaging data.
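The abstract's claim that ICA cannot solve blind source separation with Gaussian sources rests on a classical identifiability fact: any rotation of independent Gaussian sources yields another pair of independent Gaussians, so methods that rely on statistical independence alone cannot pin down the mixing matrix. A minimal NumPy sketch of this fact (an illustration only, not the paper's sieve method):

```python
import numpy as np

# Two independent standard-Gaussian sources.
rng = np.random.default_rng(0)
n = 200_000
s = rng.standard_normal((2, n))

# Mix them with an arbitrary rotation (here 45 degrees).
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = R @ s

# The mixtures are statistically indistinguishable from the sources:
# unit variance and (near-)zero correlation, i.e. still a pair of
# independent Gaussians. ICA therefore has no signal to recover R.
print(np.std(x, axis=1))      # both close to 1.0
print(np.corrcoef(x)[0, 1])   # close to 0.0
```

Because every rotation of the sources produces the same joint distribution, the separation problem is ill-posed for ICA, which is why the paper turns to common information instead.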
Year
2017
DOI
10.24963/ijcai.2017/402
Venue
IJCAI
DocType
Conference
Volume
abs/1606.02307
Citations
2
PageRank
0.38
References
16
Authors
4
Name            Order  Citations  PageRank
Greg Ver Steeg  1      2433       2.99
Shuyang Gao     2      27         5.23
Kyle Reing      3      5          1.91
Aram Galstyan   4      10339      4.05