Title
Translation Insensitivity for Deep Convolutional Gaussian Processes
Abstract
Deep learning has been at the foundation of large improvements in image classification. To improve the robustness of predictions, Bayesian approximations have been used to learn parameters in deep neural networks. We follow an alternative approach, using Gaussian processes as building blocks for Bayesian deep learning models, which has recently become viable due to advances in inference for convolutional and deep structure. We investigate deep convolutional Gaussian processes and identify a problem that holds back current performance. To remedy the issue, we introduce a translation insensitive convolutional kernel, which removes the restriction of requiring identical outputs for identical patch inputs. We show empirically that this convolutional kernel improves performance in both shallow and deep models. On MNIST, FASHION-MNIST and CIFAR-10 we improve on previous GP models in terms of accuracy, while also producing better-calibrated predictive probabilities than simple DNN models.
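The abstract describes a convolutional kernel that no longer forces identical patches at different image locations to produce identical outputs. A minimal sketch of one way to realize this idea in NumPy (the function names, the RBF choice, and the patch-kernel-times-location-kernel form are illustrative assumptions, not the paper's exact construction): weight the similarity between two patches by the similarity of their locations, so the kernel's response depends on where a patch occurs.

```python
import numpy as np

def extract_patches(img, w):
    """Return all w x w patches of a 2-D image and their (row, col) locations."""
    H, W = img.shape
    patches, locs = [], []
    for i in range(H - w + 1):
        for j in range(W - w + 1):
            patches.append(img[i:i + w, j:j + w].ravel())
            locs.append((i, j))
    return np.array(patches), np.array(locs, dtype=float)

def rbf(A, B, lengthscale):
    """Squared-exponential kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def translation_insensitive_kernel(img1, img2, w=3, patch_ls=1.0, loc_ls=2.0):
    """Hypothetical sketch of a translation-insensitive convolutional kernel:
    average, over all patch pairs, of patch similarity times location
    similarity.  With loc_ls -> infinity this reduces to a plain
    (translation-invariant) convolutional kernel."""
    P1, L1 = extract_patches(img1, w)
    P2, L2 = extract_patches(img2, w)
    return (rbf(P1, P2, patch_ls) * rbf(L1, L2, loc_ls)).mean()
```

Because the location kernel down-weights patch pairs that are far apart, two images containing the same patch at different positions no longer receive an identical contribution, which is the restriction the abstract says the new kernel removes.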
Year
2019
Venue
arXiv: Machine Learning
DocType
Journal
Volume
abs/1902.05888
Citations
1
PageRank
0.34
References
17
Authors
5
Name               Order  Citations  PageRank
Vincent Dutordoir  1      1          0.34
Mark van der Wilk  2      1          1.02
Artem Artemev      3      1          1.36
Marcin Tomczak     4      1          0.34
James Hensman      5      265        20.05