Title
MINE: Mutual Information Neural Estimation.
Abstract
This paper presents a Mutual Information Neural Estimator (MINE) that is linearly scalable in both dimensionality and sample size. MINE is trainable through back-propagation, and we prove that it is strongly consistent. We illustrate a handful of applications in which MINE is successfully applied to improve the properties of generative models in both unsupervised and supervised settings. We also apply our framework to the Information Bottleneck and use it in supervised classification tasks. Our results demonstrate substantial added flexibility and improvement in these settings.
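The abstract does not reproduce the estimator itself. As an illustration only, the following is a minimal sketch (not the authors' reference implementation) of the Donsker-Varadhan lower bound on mutual information that MINE maximizes over a statistics network; the network architecture, optimizer, and toy data are assumptions made here for demonstration, and the paper's bias-corrected gradient (moving-average baseline) is omitted.

```python
# Minimal sketch of the Donsker-Varadhan bound MINE maximizes:
#   I(X; Z) >= E_{P_XZ}[T(x, z)] - log E_{P_X x P_Z}[exp(T(x, z'))]
# Architecture and hyperparameters below are illustrative assumptions.
import math
import torch
import torch.nn as nn


class StatisticsNet(nn.Module):
    """Small MLP T_theta(x, z) applied to concatenated samples."""

    def __init__(self, x_dim, z_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, z):
        return self.net(torch.cat([x, z], dim=1))


def mine_lower_bound(T, x, z):
    """Mini-batch Donsker-Varadhan estimate of I(X; Z).

    Joint samples are the paired (x, z); marginal samples are obtained
    by shuffling z within the batch, approximating draws from P_X x P_Z.
    """
    z_marginal = z[torch.randperm(z.size(0))]
    joint_term = T(x, z).mean()
    marginal_term = torch.logsumexp(T(x, z_marginal).squeeze(-1), dim=0) \
        - math.log(z.size(0))
    return joint_term - marginal_term  # maximize w.r.t. T's parameters


if __name__ == "__main__":
    # Toy check on correlated Gaussians (rho = 0.9), whose true mutual
    # information is -0.5 * log(1 - rho^2) ~= 0.83 nats.
    torch.manual_seed(0)
    T = StatisticsNet(x_dim=1, z_dim=1)
    optimizer = torch.optim.Adam(T.parameters(), lr=1e-3)
    for _ in range(500):
        x = torch.randn(256, 1)
        z = 0.9 * x + math.sqrt(1 - 0.9 ** 2) * torch.randn(256, 1)
        loss = -mine_lower_bound(T, x, z)  # gradient ascent on the bound
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print("MI estimate (nats):", mine_lower_bound(T, x, z).item())
```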
Year: 2018
Venue: arXiv: Learning
DocType: Journal
Volume: abs/1801.04062
Citations: 6
PageRank: 0.42
References: 0
Authors: 5
Name                Order  Citations  PageRank
Ishmael Belghazi    1      183        7.57
Sai Rajeswar        2      55         4.37
Aristide Baratin    3      25         2.69
Devon Hjelm         4      28         2.23
Aaron C. Courville  5      6671       348.46