Learning solution operators of PDEs defined on varying domains via MIONet (2402.15097v2)
Abstract: In this work, we propose a method to learn the solution operators of PDEs defined on varying domains via MIONet, and we justify this method theoretically. We first extend the approximation theory of MIONet to metric spaces, establishing that MIONet can approximate mappings with multiple inputs drawn from metric spaces. We then construct a set of suitable regions and equip it with a metric, making it a metric space that satisfies the approximation condition of MIONet. Building on this theoretical foundation, we can learn the solution mapping of a PDE with all of its parameters varying, including the coefficients of the differential operator, the right-hand side, the boundary condition, and the domain. As a representative example, we perform experiments on 2-d Poisson equations with varying domains and right-hand sides. The results illustrate the performance of the method on convex polygons, on polar regions with smooth boundaries, and for predictions at different discretization levels within a single task. Additional results for the fully parameterized case are given in the appendix for interested readers. We also point out that this is a meshless method, and hence it can be used flexibly as a general solver for a given type of PDE.
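To make the architecture concrete, below is a minimal, hedged sketch of a MIONet-style network for the varying-domain setting described above: one branch net receives the right-hand side sampled at fixed sensors, a second branch net receives a finite encoding of the domain, a trunk net receives the query coordinate, and the three outputs are combined by a low-rank tensor product. The use of PyTorch, the boundary-sample encoding of the domain, and all layer sizes are illustrative assumptions, not details taken from the paper.

```python
# Minimal MIONet-style sketch (assumed details: PyTorch, tanh MLPs, domain encoded
# by a fixed-length vector of sampled boundary points; illustrative, not from the paper).
import torch
import torch.nn as nn

def mlp(sizes):
    """Fully connected network with tanh activations between hidden layers."""
    layers = []
    for i in range(len(sizes) - 1):
        layers.append(nn.Linear(sizes[i], sizes[i + 1]))
        if i < len(sizes) - 2:
            layers.append(nn.Tanh())
    return nn.Sequential(*layers)

class MIONet(nn.Module):
    """G(f, Omega)(y) ~ sum_k b1_k(f) * b2_k(Omega) * t_k(y) + bias."""
    def __init__(self, m_rhs=100, m_dom=64, p=128):
        super().__init__()
        self.branch_rhs = mlp([m_rhs, 256, 256, p])  # f sampled at m_rhs fixed sensor points
        self.branch_dom = mlp([m_dom, 256, 256, p])  # domain encoded by a length-m_dom vector (assumption)
        self.trunk = mlp([2, 256, 256, p])           # query point y in R^2
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, f_sensors, dom_code, y):
        # f_sensors: (B, m_rhs), dom_code: (B, m_dom), y: (B, 2) -> u(y): (B, 1)
        b1 = self.branch_rhs(f_sensors)
        b2 = self.branch_dom(dom_code)
        t = self.trunk(y)
        return (b1 * b2 * t).sum(dim=-1, keepdim=True) + self.bias

# Usage: predict u(y) for a batch of (right-hand side, domain encoding, query point) triples.
model = MIONet()
u_pred = model(torch.randn(8, 100), torch.randn(8, 64), torch.rand(8, 2))  # shape (8, 1)
```

In this sketch the predicted solution value at y is a sum over k of the products b1_k(f) * b2_k(Omega) * t_k(y), i.e., the multiple-input tensor-product structure of MIONet; additional branch nets for the operator coefficients and boundary data could be multiplied in the same way for the fully parameterized case mentioned in the abstract.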