Global well-posedness to stochastic reaction-diffusion equations on the real line $\mathbb{R}$ with superlinear drifts driven by multiplicative space-time white noise (2106.02879v1)

Published 5 Jun 2021 in math.PR

Abstract: Consider the stochastic reaction-diffusion equation with logarithmic nonlinearity driven by space-time white noise:
$$
\begin{cases}
\mathrm{d}u(t,x) = \dfrac{1}{2}\Delta u(t,x)\,\mathrm{d}t + b(u(t,x))\,\mathrm{d}t + \sigma(u(t,x))\,W(\mathrm{d}t,\mathrm{d}x), & t>0,\ x\in I,\\[4pt]
u(0,x) = u_0(x), & x\in I.
\end{cases}
$$
When $I$ is a compact interval, say $I=[0,1]$, the well-posedness of this equation was established in [DKZ]. The case $I=\mathbb{R}$ was left open. The essential obstacle is the explosion of the supremum norm of the solution, $\sup_{x\in\mathbb{R}}|u(t,x)|=\infty$, which makes the usual truncation procedure invalid. In this paper, we prove that there exists a unique global solution to the stochastic reaction-diffusion equation with logarithmic nonlinearity on the whole real line $\mathbb{R}$. Because of the nature of the nonlinearity, to obtain uniqueness we are forced to work with the first-order moment of the solutions on the space $C_{tem}(\mathbb{R})$ equipped with a specially designed norm
$$
\sup_{t\leq T,\, x\in\mathbb{R}}\left(|u(t,x)|\, e^{-\lambda|x|\,e^{\beta t}}\right),
$$
where, unlike the usual norm on $C_{tem}(\mathbb{R})$, the exponent also depends on the time $t$ in a particular way. Our approach depends heavily on new, precise lower-order moment estimates of the stochastic convolution and on a new type of Gronwall inequality that we establish, both of which are of interest in their own right.
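
For readers unfamiliar with the weighted spaces involved, a brief sketch using the conventions standard in the SPDE literature (assumed here, not quoted from the paper): the customary norms on $C_{tem}(\mathbb{R})$ use a weight that is fixed in time,
$$
\sup_{x\in\mathbb{R}} |u(x)|\, e^{-\lambda|x|}, \qquad \lambda>0,
$$
whereas in the norm above the exponent carries the extra factor $e^{\beta t}$, so the weight $e^{-\lambda|x|\,e^{\beta t}}$ tolerates faster spatial growth of the solution as $t$ increases. Likewise, the stochastic convolution mentioned at the end of the abstract is, in the usual mild formulation of such equations, the term
$$
\int_0^t\!\!\int_{\mathbb{R}} p_{t-s}(x-y)\,\sigma(u(s,y))\,W(\mathrm{d}s,\mathrm{d}y),
$$
where $p_t$ denotes the heat kernel associated with $\tfrac{1}{2}\Delta$.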
