
Probabilistic Recurrent State-Space Models (1801.10395v2)

Published 31 Jan 2018 in stat.ML

Abstract: State-space models (SSMs) are a highly expressive model class for learning patterns in time series data and for system identification. Deterministic versions of SSMs (e.g. LSTMs) have proved extremely successful in modeling complex time series data. Fully probabilistic SSMs, however, are often hard to train, even for smaller problems. To overcome this limitation, we propose a novel model formulation and a scalable training algorithm based on doubly stochastic variational inference and Gaussian processes. In contrast to existing work, the proposed variational approximation allows one to fully capture the latent state temporal correlations. These correlations are the key to robust training. The effectiveness of the proposed PR-SSM is evaluated on a set of real-world benchmark datasets in comparison to state-of-the-art probabilistic model learning methods. Scalability and robustness are demonstrated on a high-dimensional problem.
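To make the model class concrete, the following is a minimal sketch (not the authors' implementation) of a generic probabilistic state-space model rollout: latent states evolve as x_{t+1} = f(x_t, u_t) + process noise, and observations are y_t = x_t + observation noise. The transition function `f` below is a hypothetical toy nonlinearity standing in for the GP transition model that PR-SSM would place a prior over; all names and noise scales are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, u):
    # Toy nonlinear transition; in PR-SSM this map would carry a GP prior.
    return 0.9 * x + 0.1 * np.tanh(x) + u

def rollout(x0, controls, proc_std=0.05, obs_std=0.1):
    """Sample one latent trajectory and its noisy observations."""
    xs, ys = [], []
    x = x0
    for u in controls:
        # Stochastic state transition: deterministic map plus process noise.
        x = f(x, u) + rng.normal(0.0, proc_std, size=x.shape)
        # Noisy observation of the latent state.
        y = x + rng.normal(0.0, obs_std, size=x.shape)
        xs.append(x)
        ys.append(y)
    return np.stack(xs), np.stack(ys)

x0 = np.zeros(2)                           # 2-dimensional latent state
controls = 0.1 * rng.normal(size=(50, 2))  # 50 random control inputs
latents, observations = rollout(x0, controls)
print(latents.shape, observations.shape)   # (50, 2) (50, 2)
```

Sampling whole trajectories like this, rather than treating states independently, is what preserves the latent temporal correlations the abstract identifies as key to robust training.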

Authors (7)
  1. Andreas Doerr (6 papers)
  2. Christian Daniel (6 papers)
  3. Martin Schiegg (4 papers)
  4. Duy Nguyen-Tuong (5 papers)
  5. Stefan Schaal (73 papers)
  6. Marc Toussaint (87 papers)
  7. Sebastian Trimpe (111 papers)
Citations (115)
