
AdS-GNN -- a Conformally Equivariant Graph Neural Network (2505.12880v1)

Published 19 May 2025 in cs.LG, cs.AI, and hep-th

Abstract: Conformal symmetries, i.e., coordinate transformations that preserve angles, play a key role in many fields, including physics, mathematics, computer vision and (geometric) machine learning. Here we build a neural network that is equivariant under general conformal transformations. To achieve this, we lift data from flat Euclidean space to Anti de Sitter (AdS) space. This allows us to exploit a known correspondence between conformal transformations of flat space and isometric transformations on the AdS space. We then build upon the fact that such isometric transformations have been extensively studied on general geometries in the geometric deep learning literature. We employ message-passing layers conditioned on the proper distance, yielding a computationally efficient framework. We validate our model on tasks from computer vision and statistical physics, demonstrating strong performance, improved generalization capacities, and the ability to extract conformal data such as scaling dimensions from the trained network.

Summary

  • The paper introduces AdS-GNN, a novel architecture that achieves conformal equivariance by mapping data to Anti de Sitter space.
  • It employs a unique lifting procedure and distance-conditioned message passing to handle translations, rotations, scaling, and special conformal transformations.
  • Empirical results on computer vision and statistical physics tasks demonstrate improved generalization and recovery of theoretical conformal dimensions.

The paper "AdS-GNN -- a Conformally Equivariant Graph Neural Network" introduces a novel graph neural network architecture designed to be equivariant under the full group of conformal transformations, which includes translations, rotations, scalings, and special conformal transformations. The core idea is to leverage the known correspondence between conformal transformations in Euclidean space (Rd\mathbb{R}^d) and isometries (distance-preserving transformations) in Anti de Sitter ($\AdS_{d+1}$) space.

The proposed method, AdS-GNN, operates on point cloud data initially residing in $\mathbb{R}^d$. The key steps, sketched in code after the list below, are:

  1. Lifting Data to Anti de Sitter Space: Point cloud data $\{x_i\} \subset \mathbb{R}^d$ is lifted to points $\{X_i = (x_i, z_i)\} \subset \mathrm{AdS}_{d+1}$. This lifting is not a simple embedding at $z = 0$ (the boundary), which would run into the singularity of the $\mathrm{AdS}$ metric there. Instead, each point is first embedded at a small initial value $z_0$, and a refined $z_i$ is then computed from the $\mathrm{AdS}$ center of mass of its $k_{\mathrm{lift}}$ nearest neighbors in the initial embedding. This effectively assigns each point a scale $z_i$ determined by its local density. For input features $h^{\mathrm{input}}_i$ associated with $x_i$ and interpreted as a conformal field of dimension $\Delta$, the lifted feature is computed as $h^{\mathrm{lifted}}_i = \hat{z}_i^{\Delta}\, h^{\mathrm{input}}_i$, where $\hat{z}_i$ is the calculated $z$-coordinate. For scalar features such as image pixel values, $\Delta = 0$ and the features are lifted unchanged.
  2. Message Passing in AdS Space: The neural network is a graph neural network operating on the lifted points $\{X_i\}$ in $\mathrm{AdS}_{d+1}$ and their latent features $\{h_i\}$. The graph connectivity can be predefined or induced from the $k_{\mathrm{con}}$ nearest neighbors under the $\mathrm{AdS}$ proper distance. The message passing mechanism is similar to that of Euclidean graph neural networks (like EGNN), but, crucially, the message function $\psi_e$ between nodes $i$ and $j$ is conditioned on the $\mathrm{AdS}$ proper distance $D(X_i, X_j)$: $\mathbf{m}_{ij} = \psi_e(\mathbf{h}_i^l, \mathbf{h}_j^l, D(X_i, X_j))$. The $\mathrm{AdS}$ proper distance between two points $X = (x, z)$ and $X' = (x', z')$ is given by $\cosh D(X, X') = \frac{z^2 + z'^2 + \|x - x'\|^2}{2 z z'}$. Conditioning messages on the $\mathrm{AdS}$ proper distance ensures that the GNN layers themselves are exactly equivariant under isometries of $\mathrm{AdS}_{d+1}$.
  3. Output Handling: The output features from the final layer $h^{l_{\mathrm{final}}}_i$ are processed based on the task.
    • For invariant tasks (e.g., classification), global pooling (like summation) is applied to the node features.
    • For regression tasks predicting a conformal field on the boundary, the output $\phi(x_i)$ is computed by applying the inverse scaling to the invariant node feature: $\phi(x_i) = \hat{z}_i^{-\Delta}\, h^{l_{\mathrm{final}}}_i$. Here, $\Delta$ is the conformal dimension of the predicted field and can be a trainable parameter, allowing the network to learn the correct scaling behavior.
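
To make the three steps concrete, here is a minimal PyTorch sketch reconstructed from the description above. It is an illustration under our own naming (lift_points, AdSLayer, and the density-based stand-in for the AdS center-of-mass refinement are ours), not the authors' implementation:

```python
import torch
import torch.nn as nn

def ads_distance(x1, z1, x2, z2):
    # cosh D(X, X') = (z^2 + z'^2 + ||x - x'||^2) / (2 z z')
    cosh_d = (z1**2 + z2**2 + ((x1 - x2) ** 2).sum(-1)) / (2 * z1 * z2)
    return torch.acosh(cosh_d.clamp(min=1.0))  # clamp guards rounding below 1

def lift_points(x, k_lift=8, z0=1e-2):
    # Step 1: assign each point a scale z_i set by its local density.
    # Stand-in for the paper's AdS center-of-mass refinement: here we
    # simply use the mean distance to the k_lift nearest neighbours.
    d = torch.cdist(x, x)                                   # (n, n) pairwise
    knn = d.topk(k_lift + 1, largest=False).values[:, 1:]   # drop self (0)
    return knn.mean(-1).clamp(min=z0)                       # (n,)

class AdSLayer(nn.Module):
    """Step 2: message passing conditioned on the AdS proper distance."""
    def __init__(self, dim):
        super().__init__()
        self.psi_e = nn.Sequential(nn.Linear(2 * dim + 1, dim), nn.SiLU(),
                                   nn.Linear(dim, dim))
        self.psi_h = nn.Sequential(nn.Linear(2 * dim, dim), nn.SiLU(),
                                   nn.Linear(dim, dim))

    def forward(self, h, x, z, edges):
        i, j = edges                                        # (E,) index tensors
        dist = ads_distance(x[i], z[i], x[j], z[j]).unsqueeze(-1)
        m = self.psi_e(torch.cat([h[i], h[j], dist], -1))   # m_ij
        agg = torch.zeros_like(h).index_add_(0, i, m)       # sum over neighbours
        return h + self.psi_h(torch.cat([h, agg], -1))

# Toy usage: lift, connect all pairs, run one layer, read out (step 3).
n, d, dim, delta = 32, 2, 16, 0.125
x = torch.randn(n, d)
z = lift_points(x)
h = z.unsqueeze(-1) ** delta * torch.randn(n, dim)   # h_lifted = z^Delta h_input
i, j = torch.meshgrid(torch.arange(n), torch.arange(n), indexing="ij")
mask = i != j
h_out = AdSLayer(dim)(h, x, z, (i[mask], j[mask]))
phi = z.unsqueeze(-1) ** (-delta) * h_out            # boundary field phi(x_i)
```

Because the layer sees coordinates only through $D(X_i, X_j)$, applying any $\mathrm{AdS}$ isometry to the lifted points leaves its scalar outputs unchanged, which is exactly the equivariance property step 2 relies on.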

The authors validate AdS-GNN on tasks from computer vision and statistical physics.

  • Computer Vision: On the SuperPixel MNIST dataset, AdS-GNN performs comparably to roto-equivariant GNNs on non-augmented data but shows significantly improved generalization under scaling and special conformal transformations, empirically demonstrating the benefits of conformal equivariance. On a shape segmentation task, it outperforms EGNN, especially in low-data regimes, suggesting better adaptation to multi-scale structures, though it does not incorporate orientation information.
  • Statistical Physics: The model is applied to predicting $N$-point correlation functions in the 2D Ising model at its critical point, a system known to exhibit conformal invariance. Trained on ground-truth correlation functions, AdS-GNN demonstrates superior performance compared to EGNN and a non-equivariant baseline across various system sizes $N$. A notable finding is the model's ability to learn the conformal dimensions $\Delta$ associated with the energy and spin operators as trainable parameters, recovering values very close to the theoretically known universal values (1 and 1/8, respectively; see the two-point function below). The model also generalizes strongly to spatial configurations outside the training range and across different system sizes $N$.
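
For reference, conformal symmetry fixes the form these learned dimensions control (standard CFT background, not a result of the paper): a primary operator $\mathcal{O}$ of dimension $\Delta$ has the two-point function

$$\langle \mathcal{O}(x)\, \mathcal{O}(y) \rangle \propto \frac{1}{|x - y|^{2\Delta}},$$

so recovering $\Delta_\epsilon = 1$ and $\Delta_\sigma = 1/8$ amounts to learning the correct power-law decay of the energy and spin correlators of the critical 2D Ising model.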

While the GNN layers are exactly $\mathrm{AdS}$ isometric (and thus conformally equivariant on the lifted data), the initial lifting procedure using the center of mass approximation is noted as a limitation, as it mildly breaks exact special conformal transformation equivariance, although experiments show this breaking is small. Another limitation is the current restriction to scalar features.

Overall, AdS-GNN presents a practical approach to building conformally equivariant neural networks by leveraging the geometry of Anti de Sitter space, demonstrating strong performance and interpretability, particularly in physical systems governed by conformal symmetry.
