
Learning the conditional law: signatures and conditional GANs in filtering and prediction of diffusion processes

Published 1 Apr 2022 in stat.ML, cs.LG, and math.OC (arXiv:2204.00611v2)

Abstract: We consider the filtering and prediction problem for a diffusion process. The signal and observation are modeled by stochastic differential equations (SDEs) driven by correlated Wiener processes. In classical estimation theory, measure-valued stochastic partial differential equations (SPDEs) are derived for the filtering and prediction measures. These equations can be hard to solve numerically. We provide an approximation algorithm using conditional generative adversarial networks (GANs) in combination with signatures, an object from rough path theory. The signature of a sufficiently smooth path determines the path completely. As a result, in some cases, GANs based on signatures have been shown to efficiently approximate the law of a stochastic process. For our algorithm, we extend this method to sample from the conditional law given noisy, partial observations. Our generator is constructed using neural differential equations (NDEs), relying on their universal approximation property. We show well-posedness by providing a rigorous mathematical framework. Numerical results demonstrate the efficiency of our algorithm.
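
To make the signature component of the abstract concrete, the following is a minimal sketch (not the paper's implementation) of the truncated, level-2 signature of a sampled path in plain numpy; the function name, the level-2 truncation, and the time-augmented Brownian example are illustrative assumptions.

```python
import numpy as np

def signature_level2(path):
    """Level-2 truncated signature of a piecewise-linear path.

    path: (N, d) array of samples X_{t_0}, ..., X_{t_{N-1}}.
    Returns the level-1 and level-2 iterated integrals concatenated
    (the constant level-0 term, always equal to 1, is omitted).
    """
    dX = np.diff(path, axis=0)            # segment increments, shape (N-1, d)
    S1 = dX.sum(axis=0)                   # level 1: total increment X_T - X_0
    prev = np.cumsum(dX, axis=0) - dX     # X_{t_{k-1}} - X_{t_0} on each segment
    S2 = prev.T @ dX + 0.5 * dX.T @ dX    # level 2: iterated integral of (X - X_0) against dX
    return np.concatenate([S1, S2.ravel()])

# Illustrative usage: signature features of a time-augmented 2-D Brownian path.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 201)
dt = np.diff(t, prepend=0.0)
bm = np.cumsum(rng.normal(scale=np.sqrt(dt)[:, None], size=(201, 2)), axis=0)
path = np.column_stack([t, bm])           # time augmentation, as commonly used with signatures
features = signature_level2(path)         # length d + d^2, with d = 3 here
```

In practice one would truncate at a higher level and use a dedicated signature library; conditioning the GAN on the noisy, partial observation would then amount to feeding signature features of the observed path to the generator and discriminator, in the spirit described in the abstract.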
