| Abstract |
|---|
| In this paper we propose a model that combines the strengths of RNNs and SGVB: the Variational Recurrent Auto-Encoder (VRAE). Such a model can be used for efficient, large-scale unsupervised learning on time series data, mapping the time series data to a latent vector representation. The model is generative, such that data can be generated from samples of the latent space. An important contribution of this work is that the model can make use of unlabeled data in order to facilitate supervised training of RNNs by initialising the weights and network state. |
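The abstract describes the core idea: an RNN encoder compresses an entire time series into the parameters of a latent distribution, a sample `z` is drawn via the SGVB reparameterization trick, and an RNN decoder is initialised from `z` to reconstruct the sequence. A minimal numpy sketch of one forward pass is below; all weight names, dimensions, and the plain-tanh RNN cells are illustrative assumptions, not the paper's exact architecture or training procedure (no loss or gradients are shown).

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(input_dim, hidden_dim, latent_dim):
    """Randomly initialise the weight matrices of this sketch (names are ours)."""
    g = lambda *s: rng.standard_normal(s) * 0.1
    return {
        "Wxh": g(hidden_dim, input_dim),   # encoder input  -> hidden
        "Whh": g(hidden_dim, hidden_dim),  # encoder hidden -> hidden
        "Wmu": g(latent_dim, hidden_dim),  # hidden -> mean of q(z|x)
        "Wlv": g(latent_dim, hidden_dim),  # hidden -> log-variance of q(z|x)
        "Wzh": g(hidden_dim, latent_dim),  # z -> decoder initial hidden state
        "Wdd": g(hidden_dim, hidden_dim),  # decoder hidden -> hidden
        "Why": g(input_dim, hidden_dim),   # decoder hidden -> output
    }

def vrae_forward(x, p):
    """Encode a whole series x (shape T x input_dim) into one latent z, then decode."""
    T = len(x)
    h = np.zeros(p["Whh"].shape[0])
    for t in range(T):                      # encoder RNN consumes the series
        h = np.tanh(p["Wxh"] @ x[t] + p["Whh"] @ h)
    mu = p["Wmu"] @ h                       # parameters of q(z|x) from final state
    logvar = p["Wlv"] @ h
    eps = rng.standard_normal(mu.shape)     # SGVB reparameterization trick:
    z = mu + np.exp(0.5 * logvar) * eps     # z = mu + sigma * eps
    h_dec = np.tanh(p["Wzh"] @ z)           # decoder state initialised from z
    y = []
    for t in range(T):                      # unroll decoder for T steps
        h_dec = np.tanh(p["Wdd"] @ h_dec)
        y.append(p["Why"] @ h_dec)
    return np.stack(y), mu, logvar
```

Because the whole sequence maps to a single latent vector, sampling `z` from the prior and running only the decoder generates new series, which is the generative use the abstract mentions.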
| Year | Venue | Field |
|---|---|---|
| 2014 | CoRR | Time series, Computer science, Auto encoders, Unsupervised learning, Artificial intelligence, Supervised training, Generative grammar, Machine learning |

| DocType | Volume | Citations |
|---|---|---|
| Journal | abs/1412.6581 | 25 |

| PageRank | References | Authors |
|---|---|---|
| 1.42 | 8 | 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Otto Fabius | 1 | 25 | 1.42 |
| Joost R. van Amersfoort | 2 | 35 | 1.95 |
| Diederik P. Kingma | 3 | 8013 | 263.16 |