
Multilevel domain decomposition-based architectures for physics-informed neural networks (2306.05486v3)

Published 8 Jun 2023 in math.NA and cs.NA

Abstract: Physics-informed neural networks (PINNs) are a powerful approach for solving problems involving differential equations, yet they often struggle to solve problems with high frequency and/or multi-scale solutions. Finite basis physics-informed neural networks (FBPINNs) improve the performance of PINNs in this regime by combining them with an overlapping domain decomposition approach. In this work, FBPINNs are extended by adding multiple levels of domain decompositions to their solution ansatz, inspired by classical multilevel Schwarz domain decomposition methods (DDMs). Analogous to typical tests for classical DDMs, we assess how the accuracy of PINNs, FBPINNs and multilevel FBPINNs scale with respect to computational effort and solution complexity by carrying out strong and weak scaling tests. Our numerical results show that the proposed multilevel FBPINNs consistently and significantly outperform PINNs across a range of problems with high frequency and multi-scale solutions. Furthermore, as expected in classical DDMs, we show that multilevel FBPINNs improve the accuracy of FBPINNs when using large numbers of subdomains by aiding global communication between subdomains.


Summary

  • The paper extends FBPINNs with a multilevel domain decomposition strategy to address differential equations whose solutions are high frequency and/or multi-scale.
  • The paper demonstrates that multilevel FBPINNs outperform traditional PINNs and one-level FBPINNs in scalability and accuracy through rigorous strong and weak scaling tests.
  • The study bridges classical Schwarz methods with neural network solvers, offering promising applications in fluid dynamics, geophysics, and other complex systems.

Multilevel Domain Decomposition-Based Architectures for Physics-Informed Neural Networks

The paper presents an extension of finite basis physics-informed neural networks (FBPINNs), focusing on enhancing their ability to solve differential equations with high-frequency and multi-scale solutions. Physics-informed neural networks (PINNs) have shown promise in solving differential equations by using neural networks to approximate solutions directly. However, their accuracy degrades when the solution contains high-frequency or multi-scale features. This paper builds on FBPINNs by integrating multilevel domain decomposition, inspired by classical Schwarz methods, to address these limitations.

Core Methodology

At the heart of this work is the incorporation of a multilevel approach into FBPINNs. Traditional FBPINNs use an overlapping domain decomposition, employing multiple smaller neural networks, each responsible for one subdomain of the problem domain. The multilevel strategy adds a hierarchy of overlapping domain decompositions at different scales, improving information transfer between subdomains and enhancing the scalability of FBPINNs. The goal is to improve accuracy while maintaining computational efficiency, particularly when large numbers of subdomains are used.
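To make the ansatz concrete, below is a minimal 1D sketch (not the authors' implementation) of a multilevel FBPINN-style solution: on each level, smooth overlapping window functions that sum to one weight the outputs of small subdomain networks, so that u(x) ≈ Σ_l Σ_j ω_{lj}(x) NN_{lj}(x). The Gaussian windows, network sizes, and helper names here are illustrative assumptions; the paper's construction differs in detail.

```python
import jax.numpy as jnp
from jax import random

def init_mlp(key, sizes):
    """Initialize a small fully connected network as (weight, bias) pairs."""
    params = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        key, sub = random.split(key)
        params.append((random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in),
                       jnp.zeros(d_out)))
    return params

def mlp_apply(params, x):
    """Evaluate the network with tanh hidden activations."""
    h = x
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return h @ W + b

def windows(x, centers, width):
    """Smooth overlapping windows, normalized to a partition of unity."""
    raw = jnp.exp(-((x - centers) / width) ** 2)   # (n_points, n_subdomains)
    return raw / jnp.sum(raw, axis=1, keepdims=True)

def multilevel_ansatz(level_params, levels, x):
    """Sum window-weighted subnetwork outputs over all levels, coarse to fine."""
    u = jnp.zeros(x.shape[0])
    for params_l, (centers, width) in zip(level_params, levels):
        w = windows(x, centers, width)             # (n_points, n_subdomains)
        outs = jnp.stack([mlp_apply(p, x)[:, 0] for p in params_l], axis=1)
        u = u + jnp.sum(w * outs, axis=1)
    return u

# Example: a 2-level decomposition of [0, 1] with 1 coarse and 4 fine subdomains.
key = random.PRNGKey(0)
levels = [(jnp.array([0.5]), 0.5),
          (jnp.linspace(0.0, 1.0, 4), 0.2)]
level_params = [[init_mlp(random.fold_in(key, 10 * l + j), [1, 16, 16, 1])
                 for j in range(len(centers))]
                for l, (centers, _) in enumerate(levels)]
x = jnp.linspace(0.0, 1.0, 100)[:, None]
u = multilevel_ansatz(level_params, levels, x)     # shape (100,)
```

Because each window localizes its subnetwork, fine levels capture high-frequency detail while the coarse level carries global information across the domain, which is the communication role played by coarse spaces in classical Schwarz methods.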

Numerical Results and Performance

The paper evaluates the proposed multilevel FBPINN architecture through a series of scaling tests on differential equations, including homogeneous Laplacian, multi-scale Laplacian, and Helmholtz problems in two dimensions. The results show that multilevel FBPINNs consistently outperform both traditional PINNs and one-level FBPINNs. A key finding is that multilevel FBPINNs scale effectively as problem size and solution complexity increase, maintaining their accuracy across varying conditions.
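For context, a physics-informed loss for such problems penalizes the PDE residual at collocation points using automatic differentiation. Below is a hedged sketch for a 2D Poisson problem -Δu = f; `model` can be any scalar-valued ansatz (for instance the multilevel one sketched above, evaluated at a single point), and the function names and sampling are assumptions rather than the paper's exact setup.

```python
import jax
import jax.numpy as jnp

def laplacian(u):
    """Δu at a single 2D point, via the trace of the autodiff Hessian."""
    hess = jax.hessian(u)
    return lambda xy: jnp.trace(hess(xy))

def residual_loss(model, params, pts, f):
    """Mean-squared residual of -Δu = f over a batch of collocation points."""
    u = lambda xy: model(params, xy)    # maps one point (2,) to a scalar
    lap = laplacian(u)
    # -Δu = f is equivalent to Δu + f = 0, so penalize (Δu + f)².
    res = jax.vmap(lambda xy: lap(xy) + f(xy))(pts)
    return jnp.mean(res ** 2)
```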

In the strong scaling tests, where model capacity was increased while the problem was kept fixed, multilevel FBPINNs achieved better accuracy per unit of computational effort. In the weak scaling tests, where problem complexity and model capacity grew proportionally, they maintained nearly constant accuracy despite the growing problem difficulty. This behavior is noteworthy because it suggests that the multilevel structure allows FBPINNs to scale to large decompositions and effectively solve high-frequency problems, surpassing traditional single-network approaches.
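In scaling studies of this kind, accuracy is typically reported as a normalized test error against an exact or reference solution and plotted against training cost. A minimal sketch of such a metric follows; the specific norm and normalization here are assumptions, not necessarily the paper's exact choice.

```python
import jax.numpy as jnp

def normalized_l2_error(u_pred, u_exact):
    """Relative L2 error of the network solution on a fixed test grid."""
    return jnp.linalg.norm(u_pred - u_exact) / jnp.linalg.norm(u_exact)
```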

Implications and Future Directions

The integration of multilevel domain decomposition with FBPINNs has significant implications for scientific computing and complex system modeling, offering a more robust framework for solving differential equations with multi-scale properties. By borrowing concepts from classical domain decomposition methods, multilevel FBPINNs help bridge the gap between classical numerical analysis and modern machine learning techniques.

Practical implications include potential applications in fields like fluid dynamics, geophysics, and any domain where differential equations with high-frequency components are prevalent. The paper suggests that further investigation into optimal domain decompositions and network architectures could yield even more efficient models. Moreover, incorporating learning-based decomposition strategies or adaptive window functions could enhance versatility and applicability to a broader range of problem types.

Conclusion

This paper represents a substantial development in the use of neural networks for solving differential equations, presenting multilevel FBPINNs as a viable and effective way to increase the accuracy and efficiency of such solvers. By leveraging multilevel decomposition strategies, this approach offers a promising direction for future research and application in scientific machine learning, addressing current limitations in handling complex, multi-scale problems.