Group Equivariant Fourier Neural Operators for Partial Differential Equations (2306.05697v2)

Published 9 Jun 2023 in cs.LG, cs.NA, and math.NA

Abstract: We consider solving partial differential equations (PDEs) with Fourier neural operators (FNOs), which operate in the frequency domain. Since the laws of physics do not depend on the coordinate system used to describe them, it is desirable to encode such symmetries in the neural operator architecture for better performance and easier learning. While encoding symmetries in the physical domain using group theory has been studied extensively, how to capture symmetries in the frequency domain is under-explored. In this work, we extend group convolutions to the frequency domain and design Fourier layers that are equivariant to rotations, translations, and reflections by leveraging the equivariance property of the Fourier transform. The resulting $G$-FNO architecture generalizes well across input resolutions and performs well in settings with varying levels of symmetry. Our code is publicly available as part of the AIRS library (https://github.com/divelab/AIRS).

Citations (26)

Summary

  • The paper presents a novel G-FNO architecture that extends group equivariant convolutions into the frequency domain for solving PDEs.
  • It integrates Fourier transforms with group convolutions to capture rotations, translations, and reflections, improving performance on benchmark PDE datasets.
  • Experimental results showcase lower test error rates and superior super-resolution compared to traditional FNOs and equivariant U-Nets.

Group Equivariant Fourier Neural Operators for Partial Differential Equations

The paper presents a novel approach to solving partial differential equations (PDEs) using Fourier neural operators (FNOs) that incorporate group equivariance. The focus is on designing neural architectures that can leverage symmetries inherent in physical laws to improve performance and learning efficiency. While significant progress has been made in encoding symmetries in the physical domain via group theory, the frequency domain remains under-explored. This work extends group convolutions to the frequency domain, creating Fourier layers that are equivariant to rotations, translations, and reflections. The proposed architecture, labeled G-FNO, demonstrates strong generalization across input resolutions and varying symmetry settings.

Background and Motivation

PDEs are ubiquitous in modeling spatiotemporal phenomena across disciplines such as fluid dynamics, heat transfer, and electromagnetism. Approaches to solving them target either pointwise solutions or solution maps between function spaces. Neural operators have shown promise in efficiently learning such solution maps, particularly when the same problem recurs with slightly changed parameters or initial conditions. Unlike physics-informed neural networks (PINNs), they learn offline from data, enabling rapid online inference.

PDEs naturally exhibit symmetries that are independent of the coordinate system used to describe them, an observation that can be leveraged to improve neural operator architectures. Although encoding symmetries in network architectures has been studied extensively in the physical domain, including for PINNs, applying these ideas in the frequency domain remains limited. FNOs stand out because they perform global convolutions efficiently in the frequency domain via the Fast Fourier Transform (FFT) and generalize across data discretizations, offering advantages in zero-shot super-resolution tasks.
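
To make the discussion concrete, the sketch below shows the kind of spectral convolution at the core of an FNO: it multiplies a truncated set of Fourier modes by learned complex weights and transforms back. The class name, mode count, and PyTorch implementation details are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn


class SpectralConv2d(nn.Module):
    """Minimal sketch of an FNO-style spectral convolution (not the authors' code).

    Learned complex weights multiply only the lowest `modes` Fourier
    coefficients of the input; all other frequencies are zeroed before the
    inverse transform. Because the weights index frequencies rather than grid
    points, the same layer can be evaluated on inputs of any spatial resolution.
    """

    def __init__(self, in_channels: int, out_channels: int, modes: int):
        super().__init__()
        self.modes = modes
        scale = 1.0 / (in_channels * out_channels)
        self.weights = nn.Parameter(
            scale * torch.randn(in_channels, out_channels, modes, modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_channels, H, W) -> frequency domain via real FFT
        x_ft = torch.fft.rfft2(x)
        out_ft = torch.zeros(
            x.shape[0], self.weights.shape[1], x_ft.shape[-2], x_ft.shape[-1],
            dtype=torch.cfloat, device=x.device,
        )
        m = self.modes
        # mix channels on the retained low-frequency block only
        out_ft[..., :m, :m] = torch.einsum("bixy,ioxy->boxy", x_ft[..., :m, :m], self.weights)
        return torch.fft.irfft2(out_ft, s=x.shape[-2:])


# Because the parameters live in frequency space, a layer built for a coarse
# grid can be applied directly to a finer one (zero-shot super-resolution):
layer = SpectralConv2d(3, 3, modes=12)
coarse = layer(torch.randn(2, 3, 64, 64))     # output: (2, 3, 64, 64)
fine = layer(torch.randn(2, 3, 128, 128))     # output: (2, 3, 128, 128)
```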

Methodology

The paper introduces G-FNO, which extends group equivariant convolutions into the frequency domain. The foundation lies in the equivariance of the Fourier transform under orthogonal transformations from O(2), which ensures that symmetries applied in the spatial domain are preserved in the frequency domain.
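
This property follows from a standard change of variables, written below in generic notation (not copied from the paper): rotating or reflecting a function in the spatial domain rotates or reflects its Fourier transform in the same way.

```latex
% Rotation/reflection equivariance of the Fourier transform (standard result).
% For f : R^2 -> C and an orthogonal matrix R in O(2), so that |det R| = 1,
% substituting y = Rx gives:
\[
\widehat{f \circ R}\,(\xi)
  = \int_{\mathbb{R}^2} f(Rx)\, e^{-2\pi i\, \xi \cdot x}\, \mathrm{d}x
  = \int_{\mathbb{R}^2} f(y)\, e^{-2\pi i\, \xi \cdot R^{-1} y}\, \mathrm{d}y
  = \int_{\mathbb{R}^2} f(y)\, e^{-2\pi i\, (R\xi) \cdot y}\, \mathrm{d}y
  = \hat{f}(R\xi).
\]
% Applying R in the spatial domain therefore corresponds to applying R in the
% frequency domain, the property the group equivariant Fourier layers exploit.
```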

Concretely, the group equivariant Fourier layers utilize the Fourier transform to achieve equivariance to elements of p4 (roto-translations) and p4m (roto-reflections and translations). This is accomplished by parameterizing convolution kernels in a Fourier-transformed group space, enabling efficient G-convolutions interpreted through the lens of the Convolution Theorem.
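
The sketch below illustrates the idea in simplified form for the p4 case: a single kernel is rotated over the four 90-degree rotations, each copy is convolved with the input by pointwise multiplication in the frequency domain, and the responses are stacked along a new group axis. The class name and details (per-channel convolution, spatial-domain kernel rotation) are simplifying assumptions; the paper parameterizes and rotates the kernels directly in frequency space.

```python
import torch
import torch.nn as nn


class P4LiftingFourierLayer(nn.Module):
    """Hedged sketch of a p4 (roto-translation) lifting layer built on the
    Convolution Theorem; illustrative only, not the authors' implementation.

    A single spatial kernel is rotated by 0, 90, 180 and 270 degrees; each
    rotated copy is convolved with the input globally via the frequency domain,
    and the four responses are stacked, lifting the input to a function on p4.
    """

    def __init__(self, channels: int, kernel_size: int = 5):
        super().__init__()
        # per-channel kernel for brevity; channel mixing is omitted
        self.kernel = nn.Parameter(0.1 * torch.randn(channels, kernel_size, kernel_size))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, H, W)
        H, W = x.shape[-2:]
        x_ft = torch.fft.rfft2(x)
        responses = []
        for k in range(4):  # the four rotations in p4
            rot = torch.rot90(self.kernel, k, dims=(-2, -1))
            # zero-pad the rotated kernel to the grid size, then transform
            padded = torch.zeros(rot.shape[0], H, W, device=x.device, dtype=x.dtype)
            padded[:, : rot.shape[-2], : rot.shape[-1]] = rot
            k_ft = torch.fft.rfft2(padded)
            # Convolution Theorem: circular convolution = pointwise product of spectra
            responses.append(torch.fft.irfft2(x_ft * k_ft, s=(H, W)))
        return torch.stack(responses, dim=2)  # (batch, channels, 4, H, W)
```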

The resulting G-FNO architecture combines these Fourier layers with an encoder and decoder that lift functions to the higher-dimensional group space and project them back to the base space, preserving equivariance end to end.
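
As a minimal illustration of the projection step, one common choice in group-equivariant networks is to pool over the group axis; the tensor shapes below are assumptions for illustration, and the paper's decoder may differ.

```python
import torch

# Assumed shapes for illustration: after lifting, features carry an extra
# rotation axis of size 4 (the p4 rotation subgroup).
group_feats = torch.randn(8, 32, 4, 64, 64)   # (batch, channels, rotations, H, W)

# Pooling over the group axis returns features on the base space; the result
# no longer depends on which encoded rotation was applied to the input.
base_feats = group_feats.mean(dim=2)          # (batch, channels, H, W)
```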

Experimental Results and Analysis

The proposed G-FNO was evaluated on both symmetric and non-symmetric PDE datasets, including variations of the Navier-Stokes equations and shallow water equations. Across these experiments, G-FNO consistently demonstrated lower test error rates, highlighting its efficacy in settings both with and without explicit global symmetries. Notably, the architecture outperformed both traditional FNOs with data augmentation and non-Fourier-based models like equivariant U-Nets, showcasing its robustness in various symmetry scenarios.

The paper further demonstrates the superior super-resolution capability of G-FNO compared to interpolation methods, a critical advantage in practical scenarios involving high-resolution predictions. It also shows that G-FNO retains its accuracy when inputs are rotated in ways that match the encoded symmetries, underscoring its equivariance.

Implications and Future Directions

The integration of group equivariance into FNOs offers a compelling advancement in the neural operator domain, with implications reaching beyond fluid dynamics to any PDE-constrained system characterized by symmetry. The work suggests a pathway for developing AI models that naturally align with physical laws by embedding symmetry properties directly into their architectures.

Future research may explore extending these concepts to continuous groups and improving architectures using steerable filters within the frequency domain. This exploration could potentially enhance both the interpretability and efficiency of neural solvers in complex, high-dimensional systems.

In summary, this paper marks a significant step forward in leveraging symmetries within the frequency domain for neural PDE solvers, setting a foundation for more efficient and accurate models across scientific domains.
