One-parameter statistical model for linear stochastic differential equation with time delay (1510.04115v1)
Abstract: Assume that we observe a stochastic process $(X(t))_{t\in[-r,T]}$, which satisfies the linear stochastic delay differential equation
\[
\mathrm{d} X(t) = \vartheta \int_{[-r,0]} X(t + u) \, a(\mathrm{d} u) \, \mathrm{d} t + \mathrm{d} W(t) , \qquad t \geq 0 ,
\]
where $a$ is a finite signed measure on $[-r, 0]$. The local asymptotic properties of the likelihood function are studied. Local asymptotic normality is proved in case of $v_\vartheta^* < 0$, local asymptotic quadraticity is shown if $v_\vartheta^* = 0$, and, under some additional conditions, local asymptotic mixed normality or periodic local asymptotic mixed normality is valid if $v_\vartheta^* > 0$, where $v_\vartheta^*$ is an appropriately defined quantity. As an application, the asymptotic behaviour of the maximum likelihood estimator $\widehat{\vartheta}_T$ of $\vartheta$ based on $(X(t))_{t\in[-r,T]}$ can be derived as $T \to \infty$.
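Because the drift is linear in $\vartheta$ and the noise is a standard Wiener process, the Girsanov-type likelihood for this model class leads to an estimator of the form $\widehat{\vartheta}_T = \int_0^T Y(t)\,\mathrm{d}X(t) \big/ \int_0^T Y(t)^2\,\mathrm{d}t$ with $Y(t) = \int_{[-r,0]} X(t+u)\,a(\mathrm{d}u)$. The snippet below is a minimal simulation sketch, not taken from the paper: it assumes the simplest hypothetical choice $a = \delta_{-r}$ (so the equation reduces to $\mathrm{d}X(t) = \vartheta X(t-r)\,\mathrm{d}t + \mathrm{d}W(t)$), a zero initial segment on $[-r,0]$, and illustrative parameter values, and it discretises both the process (Euler–Maruyama) and the estimator.

```python
# Sketch only: Euler--Maruyama simulation of dX(t) = theta * X(t - r) dt + dW(t)
# (the case a(du) = delta_{-r}(du), a hypothetical choice) and the discretised
# likelihood-based estimator theta_hat = (sum Y dX) / (sum Y^2 dt).
import numpy as np

rng = np.random.default_rng(0)

theta = -0.5            # true parameter (illustrative value)
r, T, dt = 1.0, 200.0, 1e-3
n_lag = int(r / dt)     # grid points covering the delay interval [-r, 0]
n = int(T / dt)         # grid points on [0, T]

# Initial segment X(t) = 0 for t in [-r, 0] (a hypothetical initial condition).
X = np.zeros(n_lag + n + 1)

num = 0.0               # accumulates the stochastic integral  int_0^T Y(t) dX(t)
den = 0.0               # accumulates                          int_0^T Y(t)^2 dt

for k in range(n):
    i = n_lag + k                    # index of time t = k * dt
    Y = X[i - n_lag]                 # Y(t) = X(t - r) for a = delta_{-r}
    dW = rng.normal(scale=np.sqrt(dt))
    dX = theta * Y * dt + dW         # Euler--Maruyama increment
    X[i + 1] = X[i] + dX
    num += Y * dX
    den += Y * Y * dt

theta_hat = num / den
print(f"true theta = {theta}, estimate = {theta_hat:.4f}")
```

With this particular choice ($\vartheta = -0.5$, $r = 1$) the delay equation is in the stable regime, which corresponds to the case $v_\vartheta^* < 0$ where the abstract asserts local asymptotic normality; other parameter choices would fall into the other regimes discussed in the paper.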