Meta-Auto-Decoder for Solving Parametric Partial Differential Equations (2111.08823v3)

Published 15 Nov 2021 in cs.LG, cs.AI, and physics.comp-ph

Abstract: Many important problems in science and engineering require solving the so-called parametric partial differential equations (PDEs), i.e., PDEs with different physical parameters, boundary conditions, shapes of computation domains, etc. Recently, building learning-based numerical solvers for parametric PDEs has become an emerging new field. One category of methods, such as the Deep Galerkin Method (DGM) and Physics-Informed Neural Networks (PINNs), aims to approximate the solution of the PDE. These methods are typically unsupervised and mesh-free, but require time-consuming training from scratch for each new set of PDE parameters. Another category of methods, such as the Fourier Neural Operator (FNO) and Deep Operator Network (DeepONet), tries to approximate the solution mapping directly. Although they are fast, requiring only a single forward pass per PDE parameter and no retraining, they often need a large corpus of paired input-output observations drawn from numerical simulations, and most of them require a predefined mesh as well. In this paper, we propose Meta-Auto-Decoder (MAD), a mesh-free and unsupervised deep learning method that enables a pre-trained model to be quickly adapted to equation instances by implicitly encoding (possibly heterogeneous) PDE parameters as latent vectors. The proposed MAD method can be interpreted through manifold learning in infinite-dimensional spaces, which provides a geometric insight. Extensive numerical experiments show that MAD converges faster than other deep learning-based methods without losing accuracy. The project page with code is available at: https://gitee.com/mindspore/mindscience/tree/master/MindElec/.
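To make the auto-decoder idea concrete, the sketch below illustrates the general pattern the abstract describes: a coordinate network conditioned on a per-instance latent vector, pre-trained across several PDE parameters with a physics-informed residual loss, then adapted to a new instance by optimizing only the latent code. This is a minimal, hedged illustration, not the authors' MindElec/MindSpore implementation; the 1D Poisson-type family -u''(x) = a sin(pi x), the latent dimension, the network widths, and all hyperparameters are assumptions chosen for brevity.

```python
# Minimal sketch (assumed setup, not the paper's code) of a Meta-Auto-Decoder-style solver:
# u_theta(x, z) takes a coordinate x and an instance-specific latent code z.
# Pre-training optimizes theta and all codes jointly; adaptation tunes only the new code.
import torch
import torch.nn as nn

class LatentPINN(nn.Module):
    def __init__(self, latent_dim=16, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1 + latent_dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x, z):
        # x: (N, 1) collocation points, z: (latent_dim,) latent code for one PDE instance
        zb = z.unsqueeze(0).expand(x.shape[0], -1)
        return self.net(torch.cat([x, zb], dim=-1))

def pde_residual(model, x, z, a):
    # Physics-informed residual of the assumed family -u'' = a * sin(pi * x).
    x = x.requires_grad_(True)
    u = model(x, z)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    f = a * torch.sin(torch.pi * x)
    return (-d2u - f).pow(2).mean()

# --- Pre-training: shared weights theta, one latent code per training instance ---
params = [0.5, 1.0, 2.0]                       # training set of PDE parameters a
model = LatentPINN()
codes = nn.ParameterList([nn.Parameter(torch.zeros(16)) for _ in params])
opt = torch.optim.Adam(list(model.parameters()) + list(codes.parameters()), lr=1e-3)
for step in range(2000):
    loss = 0.0
    for a, z in zip(params, codes):
        x = torch.rand(128, 1)                 # interior collocation points on (0, 1)
        xb = torch.tensor([[0.0], [1.0]])      # boundary points, enforcing u(0) = u(1) = 0
        loss = loss + pde_residual(model, x, z, a) + model(xb, z).pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# --- Adaptation: for an unseen parameter, fine-tune only the latent code (fast) ---
a_new, z_new = 1.5, nn.Parameter(torch.zeros(16))
opt_z = torch.optim.Adam([z_new], lr=1e-2)
for step in range(200):
    x = torch.rand(128, 1)
    xb = torch.tensor([[0.0], [1.0]])
    loss = pde_residual(model, x, z_new, a_new) + model(xb, z_new).pow(2).mean()
    opt_z.zero_grad(); loss.backward(); opt_z.step()
```

Keeping the shared weights frozen (or only lightly fine-tuned) during adaptation is what makes the per-instance cost low compared with training a PINN from scratch, while the unsupervised residual loss keeps the approach mesh-free and free of paired simulation data.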

Authors (14)
  1. Xiang Huang (49 papers)
  2. Zhanhong Ye (6 papers)
  3. Hongsheng Liu (30 papers)
  4. Beiji Shi (2 papers)
  5. Zidong Wang (48 papers)
  6. Kang Yang (69 papers)
  7. Yang Li (1142 papers)
  8. Bingya Weng (2 papers)
  9. Min Wang (233 papers)
  10. Haotian Chu (4 papers)
  11. Fan Yu (63 papers)
  12. Bei Hua (5 papers)
  13. Lei Chen (485 papers)
  14. Bin Dong (111 papers)
Citations (28)
