Title
Deep Latent-Variable Kernel Learning
Abstract
Deep kernel learning (DKL) leverages the connection between Gaussian processes (GPs) and neural networks (NNs) to build an end-to-end hybrid model. It combines the capability of NNs to learn rich representations from massive data with the nonparametric property of GPs, which provides automatic regularization by trading off model fit against model complexity. However, the deterministic NN encoder may weaken the regularization of the subsequent GP part, especially on small datasets, because the latent representation is left unconstrained. We therefore present a complete deep latent-variable kernel learning (DLVKL) model in which the latent variables perform stochastic encoding to regularize the representation. We further enhance DLVKL in two respects: 1) an expressive variational posterior based on a neural stochastic differential equation (NSDE) to improve approximation quality and 2) a hybrid prior that takes knowledge from both the SDE prior and the posterior to arrive at a flexible tradeoff. Extensive experiments indicate that DLVKL-NSDE performs similarly to well-calibrated GPs on small datasets and shows superiority on large datasets.
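For intuition, the following minimal sketch (PyTorch is assumed; this is not the authors' implementation) illustrates the building blocks the abstract names: a stochastic NN encoder that outputs a Gaussian over latent variables, a GP kernel evaluated on sampled latent codes, and a KL regularizer toward a simple i.i.d. Gaussian prior used here as a stand-in for the SDE/hybrid prior of DLVKL-NSDE. All names and dimensions are illustrative.

import torch
import torch.nn as nn

class StochasticEncoder(nn.Module):
    """Maps inputs x to a Gaussian over latent codes z (illustrative stand-in
    for the stochastic encoding described in the abstract)."""
    def __init__(self, d_in, d_latent, d_hidden=64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(d_in, d_hidden), nn.Tanh())
        self.mu = nn.Linear(d_hidden, d_latent)
        self.log_var = nn.Linear(d_hidden, d_latent)

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.log_var(h)

def rbf_kernel(z1, z2, lengthscale=1.0, variance=1.0):
    """Deep kernel: an RBF kernel evaluated on encoded latent variables."""
    d2 = torch.cdist(z1 / lengthscale, z2 / lengthscale).pow(2)
    return variance * torch.exp(-0.5 * d2)

def kl_to_standard_normal(mu, log_var):
    """KL(q(z|x) || N(0, I)): a simple substitute for the SDE/hybrid prior."""
    return 0.5 * (log_var.exp() + mu.pow(2) - 1.0 - log_var).sum()

# Usage: sample z by reparameterization, build the GP covariance on z,
# and regularize the encoder with the KL term.
encoder = StochasticEncoder(d_in=5, d_latent=2)
x = torch.randn(10, 5)
mu, log_var = encoder(x)
z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()  # reparameterization trick
K = rbf_kernel(z, z) + 1e-6 * torch.eye(len(z))         # GP covariance with jitter
kl = kl_to_standard_normal(mu, log_var)                  # regularizes the encoding

In the full model, the GP marginal likelihood built from K and the KL term would be combined into a single training objective, so the encoder is fit end-to-end rather than left as a free deterministic mapping.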
Year
2022
DOI
10.1109/TCYB.2021.3062140
Venue
IEEE Transactions on Cybernetics
Keywords
Algorithms; Learning; Models, Theoretical; Neural Networks, Computer
DocType
Journal
Volume
52
Issue
10
ISSN
2168-2267
Citations
0
PageRank
0.34
References
7
Authors
4
Name            Order   Citations   PageRank
Honghai Liu     1       141         5.32
Yew-Soon Ong    2       263         23.35
Xiaomo Jiang    3       0           0.34
Xiaofang Wang   4       36          7.83