
Predictions of Transient Vector Solution Fields with Sequential Deep Operator Network (2311.11500v2)

Published 20 Nov 2023 in cs.CE

Abstract: The Deep Operator Network (DeepONet) architecture has shown great potential for approximating complex solution operators with low generalization error. Recently, a sequential DeepONet (S-DeepONet) was proposed that uses sequential learning models in the branch network of a DeepONet to predict final solutions given time-dependent inputs. In the current work, the S-DeepONet architecture is extended by modifying the mechanism that combines information from the branch and trunk networks, so that vector solutions with multiple components can be predicted simultaneously at multiple time steps of the evolution history; to the authors' knowledge, this is the first DeepONet in the literature with this capability. Two example problems, one on transient fluid flow and the other on path-dependent plastic loading, demonstrate the model's ability to handle different physics. The use of a trained S-DeepONet for inverse parameter identification via a genetic algorithm illustrates a practical application of the model. In almost all cases, the trained model achieved an $R^2$ value above 0.99 and a relative $L_2$ error below 10\% with only 3200 training data points, indicating superior accuracy. The vector S-DeepONet model, having only 0.4\% more parameters than a scalar model, can predict two output components simultaneously at an accuracy similar to that of two independently trained scalar models, with a 20.8\% faster training time. S-DeepONet inference is at least three orders of magnitude faster than direct numerical simulation, and inverse parameter identification using the trained model is highly efficient and accurate.
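The abstract's key architectural idea, combining branch and trunk outputs so that all vector components at all time steps are produced in one forward pass, can be sketched as follows. This is a minimal illustration of one plausible dot-product combination mechanism, not the paper's exact implementation: all dimensions, array names, and the layout of the latent vectors are assumptions chosen for clarity. In the paper, the branch latent would come from a sequential model (e.g. a GRU or LSTM) applied to the time-dependent load history, and the trunk latent from a feed-forward network applied to spatiotemporal query coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not taken from the paper)
p = 32        # latent dimension shared by branch and trunk
n_comp = 2    # vector solution components (e.g. two displacement/velocity components)
n_steps = 5   # time steps in the predicted evolution history
n_nodes = 100 # spatial query points

# Branch output: one latent vector per output component, encoding the
# time-dependent input (produced by a sequential model in S-DeepONet)
branch_out = rng.standard_normal((n_comp, p))

# Trunk output: one latent vector per (node, time step) query point
trunk_out = rng.standard_normal((n_nodes, n_steps, p))

# Combination mechanism: a dot product over the shared latent dimension
# yields every component at every node and time step simultaneously
solution = np.einsum('cp,ntp->ntc', branch_out, trunk_out)

print(solution.shape)  # (100, 5, 2): nodes x time steps x components
```

The point of the single `einsum` is that extending a scalar DeepONet to `n_comp` outputs only requires `n_comp` branch latent vectors rather than a second full network, which is consistent with the abstract's claim that the vector model adds only 0.4\% more parameters than a scalar one.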
