Title
Deep Variational Canonical Correlation Analysis
Abstract
We present deep variational canonical correlation analysis (VCCA), a deep multi-view learning model that extends the latent variable model interpretation of linear CCA (Bach and Jordan, 2005) to nonlinear observation models parameterized by deep neural networks (DNNs). Computing the marginal data likelihood and inferring the latent variables are both intractable under this model. We derive a variational lower bound of the data likelihood by parameterizing the posterior density of the latent variables with another DNN, and approximate the lower bound via Monte Carlo sampling. Interestingly, the resulting model resembles that of multi-view autoencoders (Ngiam et al., 2011), with the key distinction of an additional sampling procedure at the bottleneck layer. We also propose a variant of VCCA called VCCA-private which can, in addition to the "common variables" underlying both views, extract the "private variables" within each view. We demonstrate that VCCA-private is able to disentangle the shared and private information for multi-view data without hard supervision.
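The abstract's key distinction from a deterministic multi-view autoencoder is the sampling step at the bottleneck, with the lower bound estimated by Monte Carlo. The following is a minimal NumPy sketch of that idea under simplifying assumptions (linear stand-ins for the DNN encoder and decoders, Gaussian observation models, a single Monte Carlo sample); all names and dimensions here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W_mu, W_logvar):
    """Approximate posterior q(z|x1): mean and log-variance of a
    Gaussian, here produced by linear maps standing in for a DNN."""
    return x @ W_mu, x @ W_logvar

def sample_z(mu, logvar):
    """Reparameterized sample z = mu + sigma * eps: the extra
    sampling step at the bottleneck that distinguishes VCCA from
    a deterministic multi-view autoencoder."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def elbo(x1, x2, W_mu, W_logvar, D1, D2):
    """Single-sample Monte Carlo estimate of the variational lower
    bound: reconstruction terms for both views from the shared z,
    minus the KL divergence from q(z|x1) to a standard-normal prior."""
    mu, logvar = encode(x1, W_mu, W_logvar)
    z = sample_z(mu, logvar)
    rec1 = -np.sum((x1 - z @ D1) ** 2)  # Gaussian log-likelihood, up to constants
    rec2 = -np.sum((x2 - z @ D2) ** 2)
    kl = 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)
    return rec1 + rec2 - kl

# Tiny illustrative example: 5 paired observations, 4-dim views, 2-dim latent.
x1 = rng.standard_normal((5, 4))
x2 = rng.standard_normal((5, 4))
W_mu = rng.standard_normal((4, 2))
W_logvar = 0.1 * rng.standard_normal((4, 2))
D1 = rng.standard_normal((2, 4))
D2 = rng.standard_normal((2, 4))
bound = elbo(x1, x2, W_mu, W_logvar, D1, D2)
print(np.isfinite(bound))
```

In the actual model the linear maps are deep networks trained by stochastic gradient ascent on this bound, and the reparameterization makes the sampling step differentiable.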
Year: 2016
Venue: arXiv: Learning
Field: Monte Carlo method, Parameterized complexity, Upper and lower bounds, Canonical correlation, Latent variable model, Latent class model, Latent variable, Sampling (statistics), Artificial intelligence, Mathematics, Machine learning
DocType: Journal
Volume: abs/1610.03454
Citations: 13
PageRank: 0.60
References: 18
Authors: 3
Name          | Order | Citations | PageRank
Weiran Wang   | 1     | 114       | 9.99
Honglak Lee   | 2     | 6247      | 398.39
Karen Livescu | 3     | 1254      | 71.43