Cold-Diffusion Driven Downward Continuation of Gravity Data
Abstract: Gravity data can be better interpreted after enhancing high-frequency information via downward continuation. Downward continuation is an ill-posed deconvolution problem. It has traditionally been tackled with regularization techniques, which are sensitive to the choice of regularization parameters. More recently, convolutional neural networks such as the U-Net have been trained on synthetic data to learn prior information and perform deconvolution without the need to tune regularization parameters. Our experiments reveal that the U-Net is highly sensitive to correlated noise, which is ubiquitous in geophysical field data. In this paper, we develop a framework based on the $\textbf{cold-diffusion model}$ using the exponential kernel associated with downward continuation. The exponential form of the kernel allows us to train the U-Net to tackle multiple concurrent deconvolution problems with varying levels of blur. This makes our framework more robust, and it quantitatively outperforms traditional U-Net-based approaches. Its performance also closely matches that of an $\textbf{oracle}$ Tikhonov reconstruction technique, which has access to the ground truth.
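The classical baseline referenced in the abstract can be illustrated with a short sketch. In the wavenumber domain, upward continuation by a height $h$ multiplies the spectrum of the gravity field by the exponential kernel $e^{-h|k|}$; downward continuation is the inverse operation, which amplifies high-wavenumber noise and is therefore stabilized here with Tikhonov damping. This is a minimal, hypothetical NumPy implementation for illustration only (the function names, grid, and the damping parameter `alpha` are our own choices, not the authors' code):

```python
import numpy as np

def _radial_wavenumber(ny, nx, dy, dx):
    # |k| on the 2-D FFT grid.
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    return np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)

def upward_continue(g, dx, dy, h):
    # Forward (blurring) operator: multiply the spectrum by exp(-h|k|).
    K = np.exp(-h * _radial_wavenumber(*g.shape, dy, dx))
    return np.real(np.fft.ifft2(np.fft.fft2(g) * K))

def downward_continue_tikhonov(g, dx, dy, h, alpha):
    # Naive inversion divides by exp(-h|k|), which blows up noise at
    # high |k|; the Tikhonov-damped inverse K / (K^2 + alpha) keeps the
    # amplification bounded. alpha plays the role of the regularization
    # parameter the abstract says must be tuned.
    K = np.exp(-h * _radial_wavenumber(*g.shape, dy, dx))
    return np.real(np.fft.ifft2(np.fft.fft2(g) * K / (K ** 2 + alpha)))

if __name__ == "__main__":
    # Smooth synthetic anomaly on a 64 x 64 grid.
    x = np.linspace(-1.0, 1.0, 64)
    dx = x[1] - x[0]
    X, Y = np.meshgrid(x, x)
    g0 = np.exp(-(X ** 2 + Y ** 2) / 0.1)

    g_up = upward_continue(g0, dx, dx, h=0.05)
    g_rec = downward_continue_tikhonov(g_up, dx, dx, h=0.05, alpha=1e-6)
    print(float(np.max(np.abs(g_rec - g0))))  # small residual on clean data
```

In the noise-free setting above, a small `alpha` recovers the field well; with correlated field noise, the choice of `alpha` becomes delicate, which is the sensitivity the learned approaches in the paper aim to avoid.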