
Transfer Learning Enhanced Full Waveform Inversion (2302.11259v2)

Published 22 Feb 2023 in cs.LG, physics.comp-ph, and physics.geo-ph

Abstract: We propose a way to favorably employ neural networks in the field of non-destructive testing using Full Waveform Inversion (FWI). The presented methodology discretizes the unknown material distribution in the domain with a neural network within an adjoint optimization. To further increase the efficiency of the FWI, pretrained neural networks are used to provide a good starting point for the inversion. This reduces the number of iterations of the Full Waveform Inversion for specific, yet generalizable, settings.
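
The following is a minimal, hedged sketch (not the authors' code) of the idea described in the abstract: a coordinate-based neural network parametrizes the unknown material distribution, optionally starts from pretrained weights (the transfer-learning step), and its weights are then updated by gradient-based optimization of a data misfit. The network architecture, the hypothetical pretrained weight file, the stand-in linear forward operator, and all hyperparameters below are illustrative assumptions; in the paper the forward model is a wave-equation solver whose gradients come from an adjoint computation.

```python
# Sketch: neural-network parametrization of the material field for an
# FWI-style inversion, with an optional pretrained starting point.
import torch
import torch.nn as nn

class MaterialNet(nn.Module):
    """Maps spatial coordinates (x, z) to a scalar material indicator in [0, 1]."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, xz):
        return self.net(xz).squeeze(-1)

# Collocation grid discretizing the domain (illustrative resolution).
nx, nz = 64, 32
xs = torch.linspace(0.0, 1.0, nx)
zs = torch.linspace(0.0, 0.5, nz)
grid = torch.stack(torch.meshgrid(xs, zs, indexing="ij"), dim=-1).reshape(-1, 2)

model = MaterialNet()
# Transfer-learning step (hypothetical file name): initialize from weights
# pretrained on related settings instead of a random initialization.
# model.load_state_dict(torch.load("pretrained_fwi_net.pt"))

# Stand-in forward operator and synthetic "measured" data; a real FWI loop
# would replace this with seismograms from a wave-equation solver and
# adjoint-based gradients of the misfit.
forward_op = torch.randn(128, nx * nz)
observed = forward_op @ torch.rand(nx * nz)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for it in range(200):
    optimizer.zero_grad()
    gamma = model(grid)                       # predicted material field on the grid
    residual = forward_op @ gamma - observed  # data misfit
    loss = 0.5 * residual.pow(2).sum()
    loss.backward()                           # gradient w.r.t. network weights
    optimizer.step()
```

Starting from pretrained weights rather than a random initialization is what, in this reading, reduces the number of inversion iterations for settings similar to those seen during pretraining.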
