Learning the conditional law: signatures and conditional GANs in filtering and prediction of diffusion processes (2204.00611v2)

Published 1 Apr 2022 in stat.ML, cs.LG, and math.OC

Abstract: We consider the filtering and prediction problem for a diffusion process. The signal and observation are modeled by stochastic differential equations (SDEs) driven by correlated Wiener processes. In classical estimation theory, measure-valued stochastic partial differential equations (SPDEs) are derived for the filtering and prediction measures; these equations can be hard to solve numerically. We provide an approximation algorithm that combines conditional generative adversarial networks (GANs) with signatures, an object from rough path theory. The signature of a sufficiently smooth path determines the path completely, and as a result GANs based on signatures have been shown, in some cases, to efficiently approximate the law of a stochastic process. For our algorithm, we extend this method to sample from the conditional law given noisy, partial observations. Our generator is constructed using neural differential equations (NDEs), relying on their universal approximation property. We show well-posedness by providing a rigorous mathematical framework. Numerical results demonstrate the efficiency of our algorithm.
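The abstract does not give the paper's model or architecture details, but the signature-as-conditioning step it describes can be illustrated with a small, self-contained sketch. The code below is not the authors' implementation: it simulates a toy signal/observation pair driven by correlated Wiener processes (an Ornstein-Uhlenbeck signal and a linear observation are assumptions chosen for illustration), computes the truncated signature of the time-augmented observation path via Chen's identity, and returns the feature vector that a conditional generator could take as input. All function names (simulate, seg_exp, tensor_mul, signature) are hypothetical.

```python
import numpy as np

def simulate(T=1.0, n=200, rho=0.5, seed=0):
    """Euler-Maruyama simulation of a toy signal X and observation Y
    driven by correlated Wiener processes (illustrative model only)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    x = np.zeros(n + 1)  # signal
    y = np.zeros(n + 1)  # noisy, partial observation
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))
        db = rho * dw + np.sqrt(1.0 - rho**2) * rng.normal(0.0, np.sqrt(dt))
        x[i + 1] = x[i] - 0.5 * x[i] * dt + dw        # Ornstein-Uhlenbeck signal (assumed)
        y[i + 1] = y[i] + x[i] * dt + 0.25 * db       # observation correlated with the signal noise
    t = np.linspace(0.0, T, n + 1)
    return t, x, y

def seg_exp(delta, depth):
    """Signature of one linear segment: exp(delta) in the truncated tensor algebra."""
    levels = [np.array(1.0)]
    term = np.array(1.0)
    for k in range(1, depth + 1):
        term = np.multiply.outer(term, delta) / k     # delta^{tensor k} / k!
        levels.append(term)
    return levels

def tensor_mul(a, b, depth):
    """Truncated tensor-algebra product (Chen's identity for path concatenation)."""
    return [sum(np.multiply.outer(a[i], b[k - i]) for i in range(k + 1))
            for k in range(depth + 1)]

def signature(path, depth):
    """Truncated signature of a piecewise-linear path of shape (T, d)."""
    d = path.shape[1]
    sig = [np.array(1.0)] + [np.zeros((d,) * k) for k in range(1, depth + 1)]
    for delta in np.diff(path, axis=0):
        sig = tensor_mul(sig, seg_exp(delta, depth), depth)
    return np.concatenate([lvl.ravel() for lvl in sig[1:]])

# Usage: the truncated signature of the time-augmented observation path is the
# conditioning feature a conditional GAN generator would consume.
t, x, y = simulate()
obs_path = np.stack([t, y], axis=1)        # time-augmented observation path
features = signature(obs_path, depth=3)    # feature vector of length 2 + 4 + 8 = 14
```

The sketch stops at feature extraction; in the paper's setup, the generator itself is a neural differential equation that maps noise together with such observation features to samples from the conditional (filtering or prediction) law.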
