Mean-square approximations of Lévy noise driven SDEs with super-linearly growing diffusion and jump coefficients
Abstract: This paper first establishes a fundamental mean-square convergence theorem for general one-step numerical approximations of L\'{e}vy noise driven stochastic differential equations with non-globally Lipschitz coefficients. Two novel explicit schemes are then designed, and their convergence rates are precisely identified via the fundamental theorem. In contrast to existing works, we do not impose a global Lipschitz condition on the jump coefficient but instead formulate assumptions that allow for its super-linear growth; we do, however, require the L\'{e}vy measure to be finite. New arguments are developed to handle the essential difficulties in the convergence analysis caused by the super-linear growth of the jump coefficient and by the fact that higher moment bounds of the Poisson increments $\int_t^{t+h} \int_Z \bar{N}(\mbox{d}s,\mbox{d}z)$, $t \geq 0$, $h > 0$, contribute a magnitude of no more than $O(h)$. Numerical results are finally reported to confirm the theoretical findings.
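To fix ideas, the following is a minimal LaTeX sketch of the setting the abstract describes and of one possible explicit scheme of tamed Euler type. The notation $f$, $g$, $\sigma$, $W$, $\nu$ and the particular taming factor are illustrative assumptions; the sketch is not the paper's actual pair of schemes, whose precise form the abstract does not state.

```latex
% Illustrative sketch only (not taken from the paper): the SDE class described in
% the abstract and one possible tamed Euler-type explicit step for it.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Consider a L\'evy noise driven SDE with drift $f$, diffusion $g$ and jump
coefficient $\sigma$, where $\bar{N}$ is the compensated Poisson random measure
associated with a finite L\'evy measure $\nu$ on $Z$:
\begin{equation*}
  \mathrm{d}X_t
  = f(X_{t-})\,\mathrm{d}t
  + g(X_{t-})\,\mathrm{d}W_t
  + \int_Z \sigma(X_{t-},z)\,\bar{N}(\mathrm{d}t,\mathrm{d}z),
  \qquad X_0 = x_0 .
\end{equation*}
On a uniform grid $t_n = nh$, a generic explicit one-step scheme has the form
$Y_{n+1} = Y_n + \Phi\bigl(Y_n, h, \Delta W_n, \Delta \bar{N}_n\bigr)$, where
$\Delta W_n = W_{t_{n+1}} - W_{t_n}$ and
$\Delta \bar{N}_n(\cdot) = \bar{N}\bigl((t_n,t_{n+1}] \times \cdot\,\bigr)$.
One tamed Euler-type instance, which damps the super-linearly growing
coefficients so that the one-step increments remain controlled in $h$, is
\begin{equation*}
  Y_{n+1}
  = Y_n
  + \frac{f(Y_n)\,h + g(Y_n)\,\Delta W_n
          + \int_{t_n}^{t_{n+1}} \int_Z \sigma(Y_n,z)\,\bar{N}(\mathrm{d}s,\mathrm{d}z)}
         {1 + h\bigl(\lvert f(Y_n)\rvert + \lvert g(Y_n)\rvert^{2}
                     + \textstyle\int_Z \lvert \sigma(Y_n,z)\rvert^{2}\,\nu(\mathrm{d}z)\bigr)} .
\end{equation*}
\end{document}
```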