Neural tangent kernel analysis of PINN for advection-diffusion equation

Published 21 Nov 2022 in physics.comp-ph and stat.ML (arXiv:2211.11716v1)

Abstract: Physics-informed neural networks (PINNs) numerically approximate the solution of a partial differential equation (PDE) by incorporating the residual of the PDE along with its initial/boundary conditions into the loss function. In spite of their partial success, PINNs are known to struggle even in simple cases where the closed-form analytical solution is available. In order to better understand the learning mechanism of PINNs, this work focuses on a systematic analysis of PINNs for the linear advection-diffusion equation (LAD) using the Neural Tangent Kernel (NTK) theory. Thanks to the NTK analysis, the effects of the advection speed/diffusion parameter on the training dynamics of PINNs are studied and clarified. We show that the training difficulty of PINNs is a result of 1) the so-called spectral bias, which leads to difficulty in learning high-frequency behaviours; and 2) convergence rate disparity between different loss components that results in training failure. The latter occurs even in the cases where the solution of the underlying PDE does not exhibit high-frequency behaviour. Furthermore, we observe that this training difficulty manifests itself, to some extent, differently in advection-dominated and diffusion-dominated regimes. Different strategies to address these issues are also discussed. In particular, it is demonstrated that periodic activation functions can be used to partly resolve the spectral bias issue.
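To make the composite loss described in the abstract concrete, the following is a minimal sketch of a PINN loss for the linear advection-diffusion equation u_t + c u_x = nu u_xx. It assumes a small JAX MLP surrogate, homogeneous Dirichlet boundary conditions, and equal loss weights; none of these choices are specified by the paper, and the network/helper names are hypothetical.

```python
import jax
import jax.numpy as jnp


def mlp(params, t, x):
    """Hypothetical fully connected surrogate u_theta(t, x) with tanh activations.
    Replacing jnp.tanh with jnp.sin gives a periodic (SIREN-style) activation,
    which the paper reports partly mitigates spectral bias."""
    h = jnp.array([t, x])
    for W, b in params[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = params[-1]
    return (W @ h + b)[0]


def residual(params, t, x, c, nu):
    """PDE residual u_t + c u_x - nu u_xx of the linear advection-diffusion equation."""
    u_t = jax.grad(mlp, argnums=1)(params, t, x)
    u_x = jax.grad(mlp, argnums=2)(params, t, x)
    u_xx = jax.grad(jax.grad(mlp, argnums=2), argnums=2)(params, t, x)
    return u_t + c * u_x - nu * u_xx


def pinn_loss(params, pts_r, pts_ic, pts_bc, u_ic, c=1.0, nu=0.1):
    """Composite PINN loss: PDE residual + initial condition + boundary condition."""
    # Residual term on interior collocation points (rows are (t, x) pairs).
    r = jax.vmap(lambda p: residual(params, p[0], p[1], c, nu))(pts_r)
    loss_r = jnp.mean(r ** 2)
    # Initial-condition term: match u(0, x) to the prescribed profile u_ic.
    u0 = jax.vmap(lambda p: mlp(params, p[0], p[1]))(pts_ic)
    loss_ic = jnp.mean((u0 - u_ic) ** 2)
    # Boundary-condition term (homogeneous Dirichlet, purely for illustration).
    ub = jax.vmap(lambda p: mlp(params, p[0], p[1]))(pts_bc)
    loss_bc = jnp.mean(ub ** 2)
    # Equal weighting is used here for simplicity; the paper's NTK analysis shows
    # these terms can converge at very different rates, which is one source of
    # the training failures it studies.
    return loss_r + loss_ic + loss_bc
```

The NTK analysis in the paper concerns the training dynamics of exactly this kind of multi-term loss: the advection speed c and diffusion parameter nu shift the relative convergence rates of the residual and initial/boundary terms, so the same sketch behaves differently in advection-dominated and diffusion-dominated regimes.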
