Title
Information Potential Auto-Encoders
Abstract
In this paper, we propose a framework that uses mutual information as a regularization criterion for training Auto-Encoders (AEs). In the proposed framework, AEs are regularized during training by minimizing the mutual information between their input and encoding variables. To estimate the entropy of the encoding variables and the mutual information, we propose a non-parametric method. We also give an information-theoretic view of Variational AEs (VAEs), which suggests that VAEs can be considered parametric methods for entropy estimation. Experimental results show that, compared to methods that estimate entropy parametrically, such as VAEs, the proposed non-parametric models have more degrees of freedom when learning representations of features drawn from complex distributions such as mixtures of Gaussians.
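The title's "information potential" is a standard quantity in information-theoretic learning: a Parzen-window (kernel) estimate of the pairwise interaction of samples, from which Rényi's quadratic entropy follows as a negative log. The sketch below illustrates this kind of non-parametric entropy estimate on a batch of encodings; the helper names, the Gaussian kernel, and the fixed bandwidth are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def information_potential(z, sigma=1.0):
    """Parzen-window 'information potential' of a sample z (N x d):
    V(z) = (1/N^2) * sum_ij k(z_i, z_j), with an unnormalized
    Gaussian kernel so that 0 < V(z) <= 1.
    (Hypothetical helper; the paper's estimator may differ.)"""
    diff = z[:, None, :] - z[None, :, :]      # pairwise differences, N x N x d
    sq = np.sum(diff ** 2, axis=-1)           # squared pairwise distances
    k = np.exp(-sq / (2.0 * sigma ** 2))      # Gaussian kernel values
    return k.mean()

def renyi_quadratic_entropy(z, sigma=1.0):
    """Renyi's quadratic entropy estimate: H2(z) = -log V(z)."""
    return -np.log(information_potential(z, sigma))

rng = np.random.default_rng(0)
tight = 0.1 * rng.standard_normal((200, 2))   # concentrated encodings
spread = 5.0 * rng.standard_normal((200, 2))  # dispersed encodings
# A concentrated code distribution yields a lower entropy estimate,
# which is the quantity a mutual-information regularizer would penalize:
assert renyi_quadratic_entropy(tight) < renyi_quadratic_entropy(spread)
```

Because the estimate is differentiable in `z`, a term of this form can in principle be added to an AE's reconstruction loss and minimized by gradient descent, which is the regularization role the abstract describes.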
Year: 2017
Venue: CoRR
DocType: Journal
Volume: abs/1706.04635
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name               Order  Citations  PageRank
Yan Zhang          1      777        123.70
Mete Ozay          2      0          1.35
Zhun Sun           3      12         3.49
Takayuki Okatani   4      492        50.10