
Learning the solution operator of parametric partial differential equations with physics-informed DeepOnets (2103.10974v1)

Published 19 Mar 2021 in cs.LG, cs.NA, math.NA, and stat.ML

Abstract: Deep operator networks (DeepONets) are receiving increased attention thanks to their demonstrated capability to approximate nonlinear operators between infinite-dimensional Banach spaces. However, despite their remarkable early promise, they typically require large training data-sets consisting of paired input-output observations which may be expensive to obtain, while their predictions may not be consistent with the underlying physical principles that generated the observed data. In this work, we propose a novel model class coined as physics-informed DeepONets, which introduces an effective regularization mechanism for biasing the outputs of DeepOnet models towards ensuring physical consistency. This is accomplished by leveraging automatic differentiation to impose the underlying physical laws via soft penalty constraints during model training. We demonstrate that this simple, yet remarkably effective extension can not only yield a significant improvement in the predictive accuracy of DeepOnets, but also greatly reduce the need for large training data-sets. To this end, a remarkable observation is that physics-informed DeepONets are capable of solving parametric partial differential equations (PDEs) without any paired input-output observations, except for a set of given initial or boundary conditions. We illustrate the effectiveness of the proposed framework through a series of comprehensive numerical studies across various types of PDEs. Strikingly, a trained physics-informed DeepOnet model can predict the solution of $\mathcal{O}(10^3)$ time-dependent PDEs in a fraction of a second -- up to three orders of magnitude faster compared to a conventional PDE solver. The data and code accompanying this manuscript are publicly available at \url{https://github.com/PredictiveIntelligenceLab/Physics-informed-DeepONets}.

Citations (573)

Summary

  • The paper introduces a novel physics-informed DeepONet framework that integrates a physics-based regularization to reduce dependency on extensive training data.
  • It demonstrates significant accuracy improvements in numerical experiments, including an 80% boost in solving diffusion-reaction equations.
  • The approach bridges data-driven and physics-constrained modeling, offering efficient and cost-effective simulations across various scientific domains.

Physics-Informed DeepONets for Solving Parametric PDEs

Overview

The paper introduces an advanced machine learning approach, physics-informed Deep Operator Networks (DeepONets), to enhance the process of learning solution operators for parametric partial differential equations (PDEs). Traditional DeepONets, while promising, require large training datasets and sometimes fail to align predictions with underlying physical laws. This research aims to address these limitations by integrating principles from physics-informed neural networks (PINNs).

Theoretical Contributions

Physics-informed DeepONets extend the standard DeepONet framework by incorporating a physics-based regularization mechanism during training. This approach biases model outputs to adhere to the governing PDEs even in the absence of extensive training data. The practical advantage here is notable: it reduces the reliance on paired input-output data by using known physical laws as constraints.
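The backbone being regularized is the standard DeepONet decomposition $G(u)(y) \approx \sum_k b_k(u)\, t_k(y)$: a branch network encodes the input function $u$ sampled at fixed sensor locations, a trunk network encodes the query coordinate $y$, and their dot product gives the operator output. A minimal sketch of this forward pass, with random linear maps standing in for the trained MLPs (all dimensions and names here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: m sensor points for the input function u,
# p latent features shared by the branch and trunk networks.
m, p = 50, 32

# Stand-in "networks": single random linear layers with tanh. A real
# DeepONet uses deep MLPs trained end-to-end.
W_branch = rng.normal(size=(p, m)) / np.sqrt(m)
W_trunk = rng.normal(size=(p, 1))

def deeponet(u_sensors, y):
    """G(u)(y) ~ sum_k branch_k(u) * trunk_k(y)."""
    b = np.tanh(W_branch @ u_sensors)        # branch: encodes the function u
    t = np.tanh(W_trunk @ np.atleast_1d(y))  # trunk: encodes the query point y
    return float(b @ t)

# Query the surrogate: one input function u, one evaluation point y.
x = np.linspace(0, 1, m)
u = np.sin(2 * np.pi * x)  # input function sampled at the sensor locations
out = deeponet(u, 0.5)
```

Because the trunk input $y$ is a continuous coordinate, the same trained model can be queried at arbitrary points without retraining, which is what makes the fast parametric inference reported later possible.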

Methodology

The methodology builds on the ability of neural networks to approximate nonlinear operators through a differentiable architecture. Using automatic differentiation, the authors penalize deviations of the model outputs from the governing PDE residuals and from the specified initial and boundary conditions during training. This dual focus on data and physics creates a robust training regime that enhances predictive accuracy and generalization.
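The resulting objective combines a residual penalty with an initial/boundary-condition penalty. A minimal sketch for the toy ODE $ds/dx = u(x)$, $s(0) = 0$ (the anti-derivative operator studied below), where central finite differences stand in for the automatic differentiation the paper actually uses; the function names and weighting are illustrative:

```python
import numpy as np

def physics_informed_loss(s_fn, u_fn, xs, h=1e-4, ic_weight=1.0):
    """Composite loss = mean squared PDE residual + initial-condition penalty.

    Residual enforces ds/dx = u(x); the derivative is approximated by
    central finite differences as a stand-in for automatic differentiation.
    """
    ds_dx = (s_fn(xs + h) - s_fn(xs - h)) / (2 * h)
    residual = ds_dx - u_fn(xs)
    ic = s_fn(np.array([0.0]))  # the constraint s(0) = 0
    return float(np.mean(residual**2) + ic_weight * np.mean(ic**2))

# Sanity check on known functions: for u(x) = cos(x), the exact solution
# with s(0) = 0 is s(x) = sin(x), so the loss there should be near zero.
xs = np.linspace(0.1, 1.0, 20)
loss_exact = physics_informed_loss(np.sin, np.cos, xs)
loss_wrong = physics_informed_loss(lambda x: x**2, np.cos, xs)
```

In the actual framework, `s_fn` would be the DeepONet output $G_\theta(u)(\cdot)$ and the loss would be minimized over the network parameters $\theta$, with no paired output data required beyond the initial/boundary conditions.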

Numerical Results

The paper discusses multiple numerical experiments across different types of parametric PDEs:

  1. Anti-Derivative Operator: Demonstrated that physics-informed DeepONets, without requiring any explicit output data, can learn the anti-derivative operator with significantly lower prediction errors compared to conventional approaches.
  2. Diffusion-Reaction Systems: Showed that when predicting solutions for diffusion-reaction equations with source terms, physics-informed DeepONets achieve superior accuracy (an 80% improvement) without paired input-output data.
  3. Burgers' Equation: The methodology was tested on the nonlinear Burgers' equation, highlighting large performance gains over classical deep learning approaches. The trained operator predicts solutions with remarkable speed, up to three orders of magnitude faster than a conventional PDE solver.
  4. Eikonal Equation: In problems involving two-dimensional Eikonal equations, the proposed technique was effective with both simple (circle) and complex (airfoil) boundary input cases, consistently approximating the true signed distance functions.
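For a benchmark like the anti-derivative operator, evaluation compares the learned operator against a reference solution, typically via relative $L^2$ error. A small illustrative sketch of generating that ground truth with cumulative trapezoidal integration (the integration scheme and grid here are assumptions for illustration, not details from the paper):

```python
import numpy as np

def antiderivative(u_sensors, xs):
    """Reference solution s(x) = integral of u from 0 to x, via
    cumulative trapezoidal quadrature on the grid xs.

    A trained physics-informed DeepONet would be scored against this
    ground truth, e.g. with a relative L2 error."""
    increments = (u_sensors[1:] + u_sensors[:-1]) / 2 * np.diff(xs)
    return np.concatenate(([0.0], np.cumsum(increments)))

xs = np.linspace(0, 1, 1001)
u = np.cos(xs)                      # input function
s = antiderivative(u, xs)           # reference: s(x) = sin(x)
rel_l2 = np.linalg.norm(s - np.sin(xs)) / np.linalg.norm(np.sin(xs))
```

The key point of the experiments is that the physics-informed model reaches low errors on such benchmarks without ever seeing solution data; the reference is used only for evaluation.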

Implications and Future Work

The introduction of physics-informed DeepONets represents a significant advancement in the application of neural networks for operator learning. This paper emphasizes the potential for reducing training dataset sizes while ensuring consistency with physical laws, posing implications for numerous fields such as computational biology, engineering design, and environmental modeling.

Future work could explore optimal network architectures and feature embeddings tailored for specific PDEs, as suggested by the promising results with Fourier feature networks. Another direction involves optimizing the loss function weights to further enhance training efficiency and prediction robustness. Addressing these will not only refine the performance but also extend applicability to more complex, multi-scale, and multi-physics systems.

In summary, this research presents a compelling framework that bridges the gap between data-driven and physics-constrained modeling in PDE-driven applications, paving the way for more efficient and accurate simulations across scientific domains.
