
Binary structured physics-informed neural networks for solving equations with rapidly changing solutions (2401.12806v2)

Published 23 Jan 2024 in cs.LG and cs.AI

Abstract: Physics-informed neural networks (PINNs), rooted in deep learning, have emerged as a promising approach for solving partial differential equations (PDEs). By embedding the physical information described by PDEs into feedforward neural networks, PINNs are trained as surrogate models that approximate solutions without the need for labeled data. Nevertheless, despite their remarkable performance, PINNs can face difficulties, especially when dealing with equations featuring rapidly changing solutions: slow convergence, susceptibility to becoming trapped in local minima, and reduced solution accuracy. To address these issues, we propose a binary structured physics-informed neural network (BsPINN) framework, which employs a binary structured neural network (BsNN) as its neural network component. By leveraging a binary structure that reduces inter-neuron connections compared to fully connected neural networks, BsPINNs capture the local features of solutions more effectively and efficiently, which is particularly crucial for learning solutions that change rapidly. In a series of numerical experiments solving the Burgers equation, Euler equations, Helmholtz equation, and a high-dimensional Poisson equation, BsPINNs exhibit faster convergence and higher accuracy than PINNs. These experiments also show that BsPINNs resolve the over-smoothing caused by adding hidden layers in PINNs and prevent the loss of accuracy caused by the non-smoothness of PDE solutions.
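The abstract's central idea, replacing the fully connected network inside a PINN with a binary structured network that restricts inter-neuron connections, can be illustrated with a short sketch. The block-diagonal layer, the 1/2/4/8-block splitting schedule, the layer widths, and the Burgers-equation residual below are illustrative assumptions rather than the paper's exact BsNN architecture or training setup.

```python
# Hedged sketch: a "binary structured" hidden layer implemented as a block-diagonal
# linear map (neurons split into independent groups, so fewer connections than a
# dense layer), combined with a standard PINN residual loss for the 1D viscous
# Burgers equation. The exact BsNN wiring in the paper may differ; this only
# illustrates the idea of progressively reducing inter-neuron connections.
import torch
import torch.nn as nn


class BlockLinear(nn.Module):
    """Linear layer whose weight matrix is block-diagonal with n_blocks blocks."""
    def __init__(self, width, n_blocks):
        super().__init__()
        assert width % n_blocks == 0
        self.blocks = nn.ModuleList(
            [nn.Linear(width // n_blocks, width // n_blocks) for _ in range(n_blocks)]
        )

    def forward(self, x):
        chunks = torch.chunk(x, len(self.blocks), dim=-1)
        return torch.cat([blk(c) for blk, c in zip(self.blocks, chunks)], dim=-1)


class BsPINN(nn.Module):
    """Dense input/output layers; hidden layers use 1, 2, 4, ... blocks (assumed)."""
    def __init__(self, width=64, depth=4):
        super().__init__()
        self.inp = nn.Linear(2, width)  # input is (x, t)
        self.hidden = nn.ModuleList([BlockLinear(width, 2 ** i) for i in range(depth)])
        self.out = nn.Linear(width, 1)
        self.act = nn.Tanh()

    def forward(self, xt):
        h = self.act(self.inp(xt))
        for layer in self.hidden:
            h = self.act(layer(h))
        return self.out(h)


def burgers_residual(model, xt, nu=0.01 / torch.pi):
    """PDE residual u_t + u*u_x - nu*u_xx for the viscous Burgers equation."""
    xt = xt.requires_grad_(True)
    u = model(xt)
    grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = grads[:, 0:1], grads[:, 1:2]
    u_xx = torch.autograd.grad(u_x, xt, torch.ones_like(u_x), create_graph=True)[0][:, 0:1]
    return u_t + u * u_x - nu * u_xx


model = BsPINN()
xt = torch.rand(256, 2)                        # collocation points in (x, t)
loss = burgers_residual(model, xt).pow(2).mean()
loss.backward()                                # optimized like a standard PINN loss
```

Relative to a dense layer of the same width, each block-diagonal layer here uses only width²/n_blocks weights, which is one concrete way to realize the "reduced inter-neuron connections" the abstract describes; boundary- and initial-condition loss terms would be added to the residual loss in practice.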
