An effective self-supervised learning method for various seismic noise attenuation (2311.02193v1)

Published 3 Nov 2023 in physics.geo-ph

Abstract: Faced with the scarcity of clean label data in real scenarios, seismic denoising methods based on supervised learning (SL) often encounter performance limitations. Specifically, when a model trained on synthetic data is applied directly to field data, its performance declines drastically due to significant differences in feature distributions between the two. To address this challenge, we develop an effective self-supervised strategy. This strategy, while relying on a single denoising network model, adeptly attenuates various types of seismic noise. The strategy comprises two main phases: 1. The warm-up phase. Using prior knowledge or information extracted from real data, we introduce additional noise to the original noisy data, constructing noisier data with intensified noise. This noisier data serves as the input, with the original noisy data acting as the pseudo-label. This facilitates rapid pre-training of the network to capture certain noise characteristics and boosts network stability, setting the stage for the subsequent phase. 2. Iterative data refinement (IDR) phase. During this phase, we use the predictions on the original noisy data from the network trained in the previous epoch as the pseudo-labels. We continue to add noise to these predictions, creating a new noisier-noisy dataset for the current epoch of network training. Through this iterative process, we progressively reduce the discrepancy between the original noisy data and the desired clean data. Ultimately, the network's predictions on the original noisy data become our denoised results. Validations under scenarios with random noise, backscattered noise, and blending noise reveal that our method not only matches traditional SL techniques on synthetic data but significantly outperforms them on field data.
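The two-phase scheme described in the abstract can be sketched on toy data. This is a minimal illustration, not the paper's implementation: the "denoising network" is replaced by a simple per-frequency least-squares gain, and the sinusoidal trace, noise level, and epoch count are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy band-limited trace standing in for seismic data (an assumption;
# the paper works on real shot gathers, not sinusoids).
n = 4096
t = np.linspace(0.0, 1.0, n, endpoint=False)
clean = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 25 * t)
sigma = 0.7
noisy = clean + sigma * rng.standard_normal(n)

def fit_denoiser(noisier, pseudo_label):
    """Per-frequency least-squares gain mapping noisier -> pseudo-label.

    A linear stand-in for the paper's denoising network: g[k] minimizes
    |g[k] * X[k] - Y[k]|^2 independently in each frequency bin.
    """
    X = np.fft.rfft(noisier)
    Y = np.fft.rfft(pseudo_label)
    return (np.conj(X) * Y).real / (np.abs(X) ** 2 + 1e-12)

def apply_denoiser(gains, data):
    return np.fft.irfft(gains * np.fft.rfft(data), n=len(data))

# Phase 1 (warm-up): add extra noise to the noisy data, then train with
# the noisier trace as input and the original noisy trace as pseudo-label.
noisier = noisy + sigma * rng.standard_normal(n)
gains = fit_denoiser(noisier, noisy)

# Phase 2 (iterative data refinement): each epoch, the previous model's
# prediction on the original noisy data becomes the pseudo-label, and a
# freshly re-noised copy of that prediction becomes the training input.
for _ in range(6):
    pseudo = apply_denoiser(gains, noisy)
    noisier = pseudo + sigma * rng.standard_normal(n)
    gains = fit_denoiser(noisier, pseudo)

# The final prediction on the original noisy data is the denoised result.
denoised = apply_denoiser(gains, noisy)
```

With each IDR epoch, frequency bins dominated by noise receive ever smaller gains while signal bins stay near unity, so the pseudo-labels progressively approach the clean trace, which is the intuition behind the iterative refinement.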

Citations (3)
