Title
Learning Non-deterministic Representations with Energy-based Ensembles
Abstract
The goal of a generative model is to capture the distribution underlying the data, typically through latent variables. After training, these variables are often used as a new representation, more effective than the original features in a variety of learning tasks. However, the representations constructed by contemporary generative models are usually point-wise deterministic mappings from the original feature space. Thus, even with representations robust to class-specific transformations, statistically driven models trained on them would not be able to generalize when labeled data is scarce. Inspired by the stochasticity of synaptic connections in the brain, we introduce Energy-based Stochastic Ensembles. These ensembles can learn non-deterministic representations, i.e., mappings from the feature space to a family of distributions in the latent space. These mappings are encoded in a distribution over a (possibly infinite) collection of models. By conditionally sampling models from the ensemble, we obtain multiple representations for every input example and effectively augment the data. We propose an algorithm similar to contrastive divergence for training restricted Boltzmann stochastic ensembles. Finally, we demonstrate the concept of stochastic representations on a synthetic dataset and test them in a one-shot learning scenario on MNIST.
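A minimal sketch of the idea the abstract describes, assuming a factorized Gaussian over RBM weights: each model in the ensemble is one sampled weight matrix, so a single input maps to many latent codes, and a contrastive-divergence-style step updates the ensemble parameters. The names sample_model, stochastic_representations, and cd1_gradient, and all dimensions, are hypothetical illustrations, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper).
n_visible, n_hidden = 784, 64

# The "ensemble" is a distribution over RBM weight matrices, modeled
# here as a factorized Gaussian with a learnable mean and a fixed std.
W_mean = 0.01 * rng.standard_normal((n_visible, n_hidden))
W_std = 0.05 * np.ones((n_visible, n_hidden))
b_hid = np.zeros(n_hidden)
b_vis = np.zeros(n_visible)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_model():
    # Draw one RBM from the ensemble by sampling its weight matrix.
    return W_mean + W_std * rng.standard_normal((n_visible, n_hidden))

def stochastic_representations(v, k=10):
    # Map a single input v to k latent codes by sampling k models:
    # a non-deterministic representation that effectively augments the data.
    reps = []
    for _ in range(k):
        W = sample_model()
        p_h = sigmoid(v @ W + b_hid)       # hidden-unit probabilities
        reps.append(rng.binomial(1, p_h))  # one sampled binary code
    return np.stack(reps)                  # shape (k, n_hidden)

def cd1_gradient(v):
    # A CD-1-style estimate on one sampled model: positive-phase minus
    # negative-phase statistics, applied below to the ensemble mean.
    W = sample_model()
    p_h = sigmoid(v @ W + b_hid)
    h = rng.binomial(1, p_h).astype(float)
    v_neg = rng.binomial(1, sigmoid(h @ W.T + b_vis)).astype(float)
    p_h_neg = sigmoid(v_neg @ W + b_hid)
    return np.outer(v, p_h) - np.outer(v_neg, p_h_neg)

v = rng.binomial(1, 0.5, size=n_visible).astype(float)
print(stochastic_representations(v).shape)  # (10, 64): ten codes, one input
W_mean += 0.1 * cd1_gradient(v)             # one illustrative training step
```

In this sketch the stochasticity lives in the mapping itself: re-running stochastic_representations on the same input yields different codes because different models are drawn, which is the data-augmentation effect the abstract refers to.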
Year
2014
Venue
International Conference on Learning Representations
Field
Feature vector, MNIST database, Theoretical computer science, Latent variable, Sampling (statistics), Artificial intelligence, Labeled data, Contrastive divergence, Generative grammar, Mathematics, Machine learning, Generative model
Volume
abs/1412.7272
Citations
0
PageRank
0.34
References
9
Authors
3
Name, Order, Citations, PageRank
Maruan Al-Shedivat, 1, 96, 9.97
Emre Neftci, 2, 183, 17.52
Gert Cauwenberghs, 3, 1262, 167.20