Title
Probabilistic Recurrent State-Space Models
Abstract
State-space models (SSMs) are a highly expressive model class for learning patterns in time-series data and for system identification. Deterministic versions of SSMs (e.g. LSTMs) have proved extremely successful in modeling complex time-series data. Fully probabilistic SSMs, however, are often hard to train, even on smaller problems. To overcome this limitation, we propose a novel model formulation and a scalable training algorithm based on doubly stochastic variational inference and Gaussian processes. In contrast to existing work, the proposed variational approximation allows the temporal correlations of the latent state to be fully captured; these correlations are the key to robust training. The effectiveness of the proposed probabilistic recurrent state-space model (PR-SSM) is evaluated on a set of real-world benchmark datasets against state-of-the-art probabilistic model-learning methods. Scalability and robustness are demonstrated on a high-dimensional problem.
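The abstract's key idea, sampling latent-state trajectories rather than factorizing over time steps so that temporal correlations survive in the variational approximation, can be illustrated with a minimal sketch. This is not the authors' code: the transition posterior below is a toy linear stand-in for the sparse-GP posterior used in the paper, and all names are illustrative.

```python
# Hedged sketch: ancestral sampling of latent trajectories through a
# Gaussian transition posterior, in the spirit of doubly stochastic
# variational inference for state-space models. The GP transition
# posterior is replaced by an illustrative linear-Gaussian stand-in.
import numpy as np

rng = np.random.default_rng(0)

def transition_posterior(x):
    """Toy stand-in for the GP transition posterior q(f(x)):
    returns a predictive mean and variance for the next latent state."""
    mean = 0.9 * x                    # illustrative stable dynamics
    var = 0.01 * np.ones_like(x)      # illustrative predictive variance
    return mean, var

def sample_trajectory(x0, T):
    """Draw one latent trajectory: at each step, sample
    x_{t+1} ~ N(mean(x_t), var(x_t)). Sampling whole trajectories
    (rather than factorizing q over time steps) preserves the
    latent-state temporal correlations the abstract refers to."""
    xs = [x0]
    for _ in range(T):
        mean, var = transition_posterior(xs[-1])
        xs.append(mean + np.sqrt(var) * rng.standard_normal(x0.shape))
    return np.stack(xs)

traj = sample_trajectory(np.ones(2), T=50)
print(traj.shape)  # one (T+1, state_dim) trajectory sample
```

In training, many such trajectory samples would be drawn per gradient step and used in a Monte Carlo estimate of the evidence lower bound; the sketch only shows the sampling pass.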
Year: 2018
Venue: International Conference on Machine Learning
Field: Time series, Inference, Algorithm, Robustness (computer science), Artificial intelligence, Gaussian process, Statistical model, Probabilistic logic, System identification, State space, Machine learning, Mathematics
DocType: Conference
Citations: 3
PageRank: 0.37
References: 12
Authors: 7
Name                Order  Citations  PageRank
Andreas Doerr       1      9          2.53
Christian Daniel    2      115        8.61
Martin Schiegg      3      53         5.04
Duy Nguyen-Tuong    4      438        26.22
Stefan Schaal       5      6081       530.10
Marc Toussaint      6      1299       97.23
Sebastian Trimpe    7      194        19.26