Seismic Resolution Enhancement via Deep Learning with Knowledge Distillation and Domain Adaptation (2506.22018v1)
Abstract: High-resolution processing of seismic signals is crucial for subsurface geological characterization and thin-layer reservoir identification. Traditional high-resolution algorithms can partially recover high-frequency information but often lack robustness and computational efficiency and neglect inter-trace structural relationships. Many deep learning methods use end-to-end architectures that incorporate no prior knowledge and ignore disparities between data domains, leading to limited generalization. To overcome these challenges, this paper presents the Domain-Adaptive Knowledge Distillation Network (DAKD-Net), which integrates a knowledge distillation strategy with a domain adaptation mechanism for high-resolution seismic data processing. Trained on datasets generated by forward modeling, DAKD-Net establishes the physical relationship between low- and high-resolution data, extracting high-frequency prior knowledge during a guided phase and then restoring detail without prior conditions. Domain adaptation improves the model's generalization to real seismic data and the accuracy of its structural expression. DAKD-Net employs a U-Net backbone to extract spatial structural information from multi-trace seismic profiles. The knowledge distillation mechanism transfers prior knowledge, allowing high-resolution data to be recovered directly from low-resolution inputs. Domain-adaptive fine-tuning further enhances the network's performance in actual survey areas. Experimental results show that DAKD-Net outperforms traditional methods and classical deep networks in longitudinal resolution and in restoring details of complex structures, demonstrating strong robustness and practicality.
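To make the training setup described in the abstract concrete, the following is a minimal sketch (not the authors' code) of a knowledge-distillation step for resolution enhancement: a small U-Net-style student maps low-resolution multi-trace profiles to high-resolution profiles while a frozen teacher, assumed to have been pre-trained with access to high-frequency priors, guides it. All module names, loss weights, and tensor shapes here are illustrative assumptions; the domain-adaptive fine-tuning stage on field data is omitted.

```python
# Hedged sketch of distillation training for seismic resolution enhancement.
# TinyUNet, distillation_step, and the synthetic data below are assumptions,
# not the DAKD-Net implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvBlock(nn.Module):
    """Two 2D convolutions with ReLU on (batch, 1, traces, samples) profiles."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)


class TinyUNet(nn.Module):
    """A very small U-Net standing in for the paper's backbone."""
    def __init__(self, base=16):
        super().__init__()
        self.enc1 = ConvBlock(1, base)
        self.enc2 = ConvBlock(base, base * 2)
        self.dec1 = ConvBlock(base * 2 + base, base)
        self.out = nn.Conv2d(base, 1, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(F.max_pool2d(e1, 2))
        up = F.interpolate(e2, size=e1.shape[-2:], mode="bilinear",
                           align_corners=False)
        d1 = self.dec1(torch.cat([up, e1], dim=1))
        return self.out(d1)


def distillation_step(student, teacher, low_res, high_res, alpha=0.5):
    """Reconstruction loss against the forward-modeled high-resolution target
    plus a distillation term pulling the student toward the frozen teacher."""
    with torch.no_grad():
        teacher_pred = teacher(low_res)
    student_pred = student(low_res)
    recon_loss = F.l1_loss(student_pred, high_res)
    distill_loss = F.mse_loss(student_pred, teacher_pred)
    return recon_loss + alpha * distill_loss


if __name__ == "__main__":
    # Synthetic stand-in for one forward-modeled low/high-resolution pair:
    # batch of 2 single-channel profiles, 64 traces x 128 time samples.
    low = torch.randn(2, 1, 64, 128)
    high = torch.randn(2, 1, 64, 128)
    student, teacher = TinyUNet(), TinyUNet()
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    loss = distillation_step(student, teacher, low, high)
    loss.backward()
    opt.step()
    print(f"combined loss: {loss.item():.4f}")
```

In this sketch the distillation weight `alpha` and the choice of L1/MSE terms are placeholders; the paper's actual losses, guided-phase schedule, and domain-adaptation objective are not reproduced here.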