Physics-enhanced deep surrogates for partial differential equations (2111.05841v4)
Abstract: Many physics and engineering applications demand partial differential equation (PDE) property evaluations that are traditionally computed with resource-intensive high-fidelity numerical solvers. Data-driven surrogate models provide an efficient alternative but come with a significant cost of training. Emerging applications would benefit from surrogates with an improved accuracy-cost tradeoff that can be studied at scale. Here we present a "physics-enhanced deep surrogate" (PEDS) approach for developing fast surrogate models of complex physical systems described by PDEs. Specifically, we propose a combination of a low-fidelity, explainable physics simulator and a neural network generator, trained end-to-end to globally match the output of an expensive high-fidelity numerical solver. Experiments on three exemplar test cases (diffusion, reaction-diffusion, and electromagnetic scattering models) show that a PEDS surrogate can be up to 3$\times$ more accurate than an ensemble of feedforward neural networks with limited data ($\approx 10^3$ training points) and reduces the training data needed to reach a target error of 5% by at least a factor of 100. Experiments reveal that PEDS provides a general, data-driven strategy to bridge the gap between a vast array of simplified physical models and the corresponding brute-force numerical solvers that model complex systems, offering accuracy, speed, data efficiency, and physical insight into the process.
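The composition described in the abstract (a neural network generator whose output feeds a low-fidelity physics solver) can be sketched as a forward pass. The following is a minimal illustration, assuming a 1D diffusion test case on a coarse grid; all names (`Generator`, `low_fidelity_solve`, `peds_forward`), the tiny-MLP architecture, the crude finite-difference stencil, and the fixed mixing weight `w` are hypothetical simplifications, not the paper's actual implementation, which trains this pipeline end-to-end with automatic differentiation.

```python
import numpy as np

rng = np.random.default_rng(0)

class Generator:
    """Tiny MLP mapping design parameters to a coarse property grid (illustrative)."""
    def __init__(self, n_in, n_coarse, hidden=16):
        self.W1 = rng.normal(scale=0.5, size=(hidden, n_in))
        self.W2 = rng.normal(scale=0.5, size=(n_coarse, hidden))

    def __call__(self, p):
        h = np.tanh(self.W1 @ p)
        # Softplus keeps the generated diffusion coefficient positive.
        return np.log1p(np.exp(self.W2 @ h))

def low_fidelity_solve(D, f=1.0):
    """Coarse finite-difference solve of -(D u')' = f on [0,1], u(0)=u(1)=0.
    Uses a crude per-row coefficient stencil; a real coarse solver would
    average D at the cell faces."""
    n = D.size                    # number of interior grid points
    h = 1.0 / (n + 1)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 2.0 * D[i]
        if i > 0:
            A[i, i - 1] = -D[i]
        if i < n - 1:
            A[i, i + 1] = -D[i]
    return np.linalg.solve(A / h**2, np.full(n, f))

def peds_forward(generator, p, downsampled_p, w=0.5):
    """Combine the NN-generated coarse grid with a directly downsampled
    input, then run the low-fidelity solver. In PEDS this whole pipeline
    is differentiated through and trained against high-fidelity data;
    here w is a fixed constant for illustration."""
    coarse = w * generator(p) + (1 - w) * downsampled_p
    return low_fidelity_solve(coarse)

p = rng.uniform(0.5, 1.5, size=8)        # fine-grained design parameters
p_coarse = p.reshape(4, 2).mean(axis=1)  # naive downsampling to the coarse grid
u = peds_forward(Generator(n_in=8, n_coarse=4), p, p_coarse)
print(u.shape)                           # coarse solution on 4 interior points
```

The key design point is that the learned component does not replace the physics: the generator only produces the coarse-grained input, and the (cheap, explainable) solver enforces the governing equations, which is what the paper credits for the improved data efficiency.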
- Raphaël Pestourie
- Youssef Mroueh
- Chris Rackauckas
- Payel Das
- Steven G. Johnson