Convergence and stability of a micro-macro acceleration method: linear slow-fast stochastic differential equations with additive noise (1901.07405v1)

Published 22 Jan 2019 in math.NA and cs.NA

Abstract: We analyse the convergence and stability of a micro-macro acceleration algorithm for Monte Carlo simulations of stiff stochastic differential equations with a time-scale separation between the fast evolution of the individual stochastic realizations and some slow macroscopic state variables of the process. The micro-macro acceleration method performs a short simulation of a large ensemble of individual fast paths, before extrapolating the macroscopic state variables of interest over a larger time step. After extrapolation, the method constructs a new probability distribution that is consistent with the extrapolated macroscopic state variables, while minimizing the Kullback-Leibler divergence with respect to the distribution available at the end of the Monte Carlo simulation. In the current work, we study the convergence and stability of this method on linear stochastic differential equations with additive noise, when only extrapolating the mean of the slow component. For this case, we prove convergence to the microscopic dynamics when the initial distribution is Gaussian and present a stability result for non-Gaussian initial laws.
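For intuition, here is a minimal Python sketch of the micro-macro acceleration loop on a two-dimensional linear slow-fast SDE with additive noise, extrapolating only the mean of the slow component as in the setting analysed here. The drift matrix A, diffusion matrix B, time-scale parameter eps, and all step sizes are illustrative assumptions, not values from the paper; the matching step uses the Gaussian form of the Kullback-Leibler minimiser under a slow-mean constraint, which reduces to shifting the ensemble along the slow column of its covariance.

```python
import numpy as np

# Illustrative linear slow-fast SDE with additive noise (all coefficients
# are assumptions for this sketch, not taken from the paper):
#   dX_t = A X_t dt + B dW_t,   X = (slow, fast)
eps = 0.05                                   # time-scale separation parameter
A = np.array([[-1.0, 0.5],
              [1.0 / eps, -1.0 / eps]])      # fast block relaxes at rate 1/eps
B = np.diag([0.5, 1.0 / np.sqrt(eps)])

rng = np.random.default_rng(0)


def euler_maruyama(X, dt, n_steps):
    """Inner microscopic integrator: a short burst of Euler-Maruyama steps."""
    for _ in range(n_steps):
        dW = np.sqrt(dt) * rng.standard_normal(X.shape)
        X = X + dt * X @ A.T + dW @ B.T
    return X


def micro_macro_step(X, dt, n_micro, Dt):
    """One acceleration step: micro burst, extrapolation of the slow mean
    over the macro step Dt, then matching of the ensemble to that mean."""
    m0 = X[:, 0].mean()                       # slow mean before the burst
    X = euler_maruyama(X, dt, n_micro)
    m1 = X[:, 0].mean()                       # slow mean after the burst
    # Forward extrapolation of the slow mean from the micro burst to time Dt.
    m_ext = m1 + (Dt - n_micro * dt) * (m1 - m0) / (n_micro * dt)
    # Matching: for a Gaussian ensemble, the Kullback-Leibler minimiser under
    # a constraint on the slow mean is an exponential tilt, which shifts all
    # coordinates along the slow column of the sample covariance.
    cov = np.cov(X, rowvar=False)
    X = X + (m_ext - m1) / cov[0, 0] * cov[:, 0]
    return X


# Usage: 10_000 paths, micro step dt = 1e-3, bursts of 5 steps, macro step Dt = 0.05.
X = rng.standard_normal((10_000, 2))
for _ in range(200):
    X = micro_macro_step(X, dt=1e-3, n_micro=5, Dt=0.05)
print("slow mean after acceleration:", X[:, 0].mean())
```

The key design point the sketch tries to convey is that the inner Euler-Maruyama step dt must resolve the fast relaxation rate 1/eps, while the macro step Dt only needs to resolve the slow dynamics of the extrapolated mean.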
