Learning in Sinusoidal Spaces with Physics-Informed Neural Networks (2109.09338v2)

Published 20 Sep 2021 in cs.LG, cs.AI, cs.CE, and physics.comp-ph

Abstract: A physics-informed neural network (PINN) uses physics-augmented loss functions, e.g., incorporating the residual term from governing partial differential equations (PDEs), to ensure its output is consistent with fundamental physics laws. However, it turns out to be difficult to train an accurate PINN model for many problems in practice. In this paper, we present a novel perspective on the merits of learning in sinusoidal spaces with PINNs. By analyzing behavior at model initialization, we first show that a PINN of increasing expressiveness induces an initial bias around flat output functions. Notably, this initial solution can be very close to satisfying many physics PDEs, i.e., falling into a local minimum of the PINN loss that only minimizes PDE residuals, while still being far from the true solution that jointly minimizes PDE residuals and the initial and/or boundary conditions. It is difficult for gradient descent optimization to escape from such a local minimum trap, often causing the training to stall. We then prove that the sinusoidal mapping of inputs, in an architecture we label as sf-PINN, is effective in increasing input gradient variability, thus avoiding entrapment in such deceptive local minima. The level of variability can be effectively modulated to match high-frequency patterns in the problem at hand. A key facet of this paper is the comprehensive empirical study that demonstrates the efficacy of learning in sinusoidal spaces with PINNs for a wide range of forward and inverse modelling problems spanning multiple physics domains.
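The core architectural idea, mapping raw inputs through sinusoidal features before the dense layers of the network, can be sketched as follows. This is a minimal illustration, not the paper's exact sf-PINN implementation: the feature width, the random weight matrix `W`, and the frequency scale `sigma` are all hypothetical choices for demonstration; the abstract only states that a tunable sinusoidal input mapping modulates input-gradient variability.

```python
import numpy as np

rng = np.random.default_rng(0)

def sinusoidal_mapping(x, W, sigma=1.0):
    """Map raw PDE inputs (e.g. space and time coordinates) into a
    sinusoidal feature space: sin(sigma * x @ W).

    sigma modulates the variability of input gradients; larger values
    bias the downstream network toward higher-frequency patterns, which
    is the knob the paper uses to escape flat-solution local minima.
    """
    return np.sin(sigma * x @ W)

# Hypothetical setup: 2 raw inputs (x, t) lifted to 16 sinusoidal features.
W = rng.normal(size=(2, 16))
x = rng.uniform(size=(5, 2))  # 5 collocation points in the domain
features = sinusoidal_mapping(x, W, sigma=2.0)
print(features.shape)  # (5, 16)
```

The resulting `features` array would then be fed into an ordinary fully connected network trained with the usual PINN loss (PDE residual plus initial/boundary terms); only the input layer changes.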

Authors (4)
  1. Jian Cheng Wong (15 papers)
  2. Chinchun Ooi (5 papers)
  3. Abhishek Gupta (226 papers)
  4. Yew-Soon Ong (105 papers)
Citations (68)
