Neural Networks for Tamed Milstein Approximation of SDEs with Additive Symmetric Jump Noise Driven by a Poisson Random Measure (2507.04417v1)
Abstract: This work estimates the drift and diffusion functions of stochastic differential equations (SDEs) driven by a class of L\'evy processes with finite jump intensity, using neural networks. We propose a framework that integrates the Tamed-Milstein scheme with neural networks employed as non-parametric approximators of the drift function ( f: \mathbb{R} \to \mathbb{R} ) and the diffusion coefficient ( g: \mathbb{R} \to \mathbb{R} ). The model of interest is [ dX(t) = f(X(t))\, dt + g(X(t))\, dW_t + \gamma \int_{\mathbb{Z}} z\, N(dt,dz), \qquad X(0) = \xi, ] where ( W_t ) is a standard Brownian motion and ( N(dt,dz) ) is a Poisson random measure on ( (\mathbb{R}_{+} \times \mathbb{Z},\ \mathcal{B}(\mathbb{R}_{+}) \otimes \mathcal{Z},\ \lambda(\Lambda \otimes v)) ), with ( \lambda, \gamma > 0 ), ( \Lambda ) the Lebesgue measure on ( \mathbb{R}_{+} ), and ( v ) a finite measure on the measurable space ( (\mathbb{Z}, \mathcal{Z}) ). Using neural networks as non-parametric function approximators enables the modeling of complex nonlinear dynamics without assuming restrictive functional forms. The proposed methodology constitutes a flexible alternative for inference in systems with state-dependent noise and discontinuities driven by L\'evy processes.
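To make the discretization concrete, the following is a minimal NumPy sketch of one path of a tamed Milstein step for the model above, with the jump term simulated as a compound Poisson sum. The taming form (dividing the drift and diffusion increments by ( 1 + \Delta t\,|\cdot| )), the function names, and the default jump-mark sampler are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def tamed_milstein_path(f, g, dg, x0, T, n_steps, lam=1.0, gamma=1.0,
                        jump_sampler=None, rng=None):
    """Simulate one path of
        dX = f(X) dt + g(X) dW_t + gamma * int_Z z N(dt, dz),  X(0) = x0,
    with an explicit tamed Milstein step (taming form is an assumption).
    dg is the derivative of g; jump_sampler draws marks z from the
    (normalized) finite measure v on Z."""
    rng = np.random.default_rng() if rng is None else rng
    # Hypothetical default mark law: standard normal (symmetric, as in the title).
    jump_sampler = jump_sampler or (lambda size: rng.standard_normal(size))
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for n in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        # Tamed drift and diffusion increments keep the explicit scheme
        # stable when the coefficients grow super-linearly.
        drift = f(x[n]) * dt / (1.0 + dt * abs(f(x[n])))
        diff = g(x[n]) * dw / (1.0 + dt * abs(g(x[n])))
        # Milstein correction: 0.5 * g * g' * (dW^2 - dt).
        corr = 0.5 * g(x[n]) * dg(x[n]) * (dw**2 - dt)
        # Additive jumps: jump count ~ Poisson(lam * dt), marks drawn from v.
        n_jumps = rng.poisson(lam * dt)
        jump = gamma * jump_sampler(n_jumps).sum() if n_jumps else 0.0
        x[n + 1] = x[n] + drift + diff + corr + jump
    return x

# Example: mean-reverting drift with a smooth state-dependent diffusion.
path = tamed_milstein_path(
    f=lambda x: -x,
    g=lambda x: 0.5 * np.sqrt(1.0 + x**2),
    dg=lambda x: 0.5 * x / np.sqrt(1.0 + x**2),
    x0=1.0, T=1.0, n_steps=200, lam=2.0, gamma=0.1,
)
```

In the paper's setting, `f` and `g` would be replaced by neural networks trained so that the simulated increments match the observed data; the simulation step itself is unchanged.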