
Tensor Field Networks

Updated 17 December 2025
  • Tensor Field Networks are neural frameworks that enforce rotation, translation, and permutation equivariance for accurate 3D point cloud analysis.
  • They employ spherical harmonics, radial functions, and Clebsch–Gordan coefficients to build equivariant layers for applications in geometry, physics, and chemistry.
  • Field tensor network states (fTNS) extend these concepts to infinite-dimensional spaces using conformal field theory correlators to model critical quantum systems with symmetry-protected phases.

Tensor Field Networks (TFNs) are neural network architectures designed to operate on 3D point clouds with built-in invariance or equivariance to rigid Euclidean transformations (rotations and translations in $\mathbb{R}^3$) and to point permutations. Field tensor network states (fTNS), by contrast, constitute a class of infinite-dimensional tensor networks constructed from conformal field theory (CFT) correlators, designed to capture the critical ground states and symmetry-protected properties of spin systems. Both frameworks highlight the role of symmetries in geometric or physical systems and provide principled methods to encode these constraints, in machine learning models and in quantum many-body wavefunctions respectively.

1. Mathematical Foundations and Symmetry Principles

TFNs and fTNS both explicitly encode symmetry principles in their construction.

In TFNs, the central requirement is equivariance under the groups of translations in $\mathbb{R}^3$, rotations ($SO(3)$), and permutations of the point set:

  • Translational Equivariance: $T_t\{(r_\alpha, x_\alpha)\} = \{(r_\alpha + t, x_\alpha)\}$, with $L \circ T_t = T_t \circ L$.
  • Rotational Equivariance: under $g \in SO(3)$, point positions transform as $r_\alpha \to R(g)\,r_\alpha$ and features as $x_\alpha \to D^X(g)\,x_\alpha$, with the equivariance condition $L \circ [R(g) \oplus D^X(g)] = [R(g) \oplus D^Y(g)] \circ L$.
  • Permutation Equivariance: a permutation of the points indexed by $\sigma$ satisfies $L \circ P_\sigma = P_\sigma \circ L$.
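These three conditions can be checked numerically on a toy layer. The sketch below is an illustration, not the TFN architecture itself: it maps scalar features to $\ell=1$ vector features using only relative positions, and verifies all three equivariances with numpy.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(r, x):
    """Toy equivariant layer: scalar (l=0) inputs -> vector (l=1) outputs.
    y_a = sum_b x_b * (r_a - r_b), built purely from relative positions."""
    diff = r[:, None, :] - r[None, :, :]          # (N, N, 3) relative vectors
    return np.einsum('b,abk->ak', x, diff)        # (N, 3) vector features

N = 5
r = rng.normal(size=(N, 3))      # point positions
x = rng.normal(size=N)           # scalar features

# Random rotation via QR decomposition, with det fixed to +1.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
Q *= np.sign(np.linalg.det(Q))

y = layer(r, x)
assert np.allclose(layer(r + np.array([1.0, 2.0, 3.0]), x), y)  # translation invariance of features
assert np.allclose(layer(r @ Q.T, x), y @ Q.T)                  # rotation equivariance
perm = rng.permutation(N)
assert np.allclose(layer(r[perm], x[perm]), y[perm])            # permutation equivariance
```

Any layer built from relative positions inherits translation invariance for free; rotation and permutation equivariance must be checked against the transformation law of the output type.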

fTNS, in contrast, replace the finite bond dimension of matrix product states (MPS) with an infinite-dimensional virtual space defined by a 1D quantum field, namely the Hilbert space of a 2D CFT. The $N$-site spin chain wavefunction is

$$|\psi\rangle = \sum_{s_1,\dots,s_N=1}^d c_{s_1\cdots s_N}\, |s_1\cdots s_N\rangle$$

with coefficients given by functional integrals over boundary conditions for each virtual space segment:

$$c_{s_1\cdots s_N} = \int[\mathcal{D}f_1]\cdots[\mathcal{D}f_N]\,\mathcal{A}^{s_1}_{f_1,f_2}\,\mathcal{A}^{s_2}_{f_2,f_3}\cdots\mathcal{A}^{s_N}_{f_N,f_1}$$

The fTNS construction encodes symmetries at the physical layer (e.g., SU(2) spin rotation) in terms of commutators with the corresponding CFT currents acting on the virtual degrees of freedom (Gasull et al., 2022).
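As a point of reference, when the virtual space is truncated to a finite bond dimension the functional integral above reduces to the familiar MPS trace over matrix products. A minimal numpy sketch of that discrete analogue (random site tensors and illustrative dimensions, purely for orientation):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
d, D, N = 2, 3, 4                      # physical dim, bond dim, number of sites
A = rng.normal(size=(d, D, D))         # site tensor A^{s}_{f, f'}

# c_{s1...sN} = Tr(A^{s1} A^{s2} ... A^{sN}): the finite-bond analogue of the
# functional integral over boundary conditions f_1, ..., f_N.
c = np.zeros((d,) * N)
for s in product(range(d), repeat=N):
    M = np.eye(D)
    for si in s:
        M = M @ A[si]
    c[s] = np.trace(M)

psi = c.reshape(-1)
psi /= np.linalg.norm(psi)             # normalized N-site wavefunction
print(psi.shape)                       # (16,)
```

The fTNS construction replaces the matrix indices by boundary field configurations and the matrix product by a path integral.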

2. Feature Representations and Equivariant Operations

In TFNs, every feature at point $a$ is represented as a direct sum of irreducible $SO(3)$ representations (rotation orders) $l = 0, 1, 2, \dots$:

  • Scalars: $l=0$ (dimension 1)
  • Vectors: $l=1$ (dimension 3)
  • Symmetric traceless rank-2 tensors: $l=2$ (dimension 5), and so forth

Each rotation order $l$ is accompanied by $n_l$ channels, and features are indexed as $V^{(l)}_{a,c,m}$ with $m = -l, \dotsc, l$.
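This indexing scheme can be sketched directly as a Python dictionary keyed by rotation order, with assumed channel counts:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10                                  # number of points
channels = {0: 16, 1: 8, 2: 4}          # n_l channels per rotation order l (illustrative)

# V[l] has shape (N, n_l, 2l+1): indices (a, c, m) as in V^{(l)}_{a,c,m}
V = {l: rng.normal(size=(N, n, 2 * l + 1)) for l, n in channels.items()}

for l, feat in V.items():
    print(l, feat.shape)                # 0 (10, 16, 1) / 1 (10, 8, 3) / 2 (10, 4, 5)
```

Under a rotation, each `V[l]` slice mixes only along its last axis (via the Wigner matrix $D^{(l)}$), while channels never mix.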

For fTNS, the indices of the tensor network move from discrete (as in conventional MPS) to virtual bonds specified by boundary functions $f_i(x)$ in the Hilbert space $\mathbb{L}^2(\mathbb{R}) \cup \mathbb{K}$, enabling the encoding of CFT symmetries and criticality.

3. Construction of Equivariant Layers and Functionals

The TFN convolutional structure employs filters constructed from the product of spherical harmonics and learnable radial profiles:

$$W^{(\ell_f, n)}_m(r) = R_n(\|r\|)\, Y^{(\ell_f)}_m(\hat{r})$$

Here, $Y^{(\ell)}_m$ are real spherical harmonics and $R_n$ are trainable radial functions. The convolution of a neighbor feature of order $\ell_i$ with a filter of order $\ell_f$ is performed via tensor products, with the output decomposed into all admissible orders $\ell_o \in \{|\ell_i - \ell_f|, \dotsc, \ell_i + \ell_f\}$ using Clebsch–Gordan coefficients:

$$(h'_i)_{\ell_o, m_o} = \sum_{j \in N(i)} \sum_{\ell_f, m_f} \sum_{\ell_i, m_i} \sum_n C^{(\ell_o, m_o)}_{(\ell_f, m_f), (\ell_i, m_i)}\, h^{(\ell_i)}_{j,m_i}\, R_n(\|r_{ij}\|)\, Y^{(\ell_f)}_{m_f}(\hat{r}_{ij})$$

This structure guarantees transformation under $SO(3)$ as geometric tensors of order $\ell_o$ (Thomas et al., 2018).
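A single filter branch of this convolution can be sketched in a few lines. The example below takes $\ell_i = 0$ inputs and an $\ell_f = 1$ filter, for which the Clebsch–Gordan contraction to $\ell_o = 1$ is trivial and the real $\ell=1$ harmonics are proportional to $(x, y, z)$; the radial basis and weights are illustrative choices, not the trained profiles of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
widths = np.linspace(0.5, 2.0, 4)      # fixed Gaussian radial basis centers (assumed)
w = rng.normal(size=4)                 # "learnable" radial weights

def conv(pos, h0):
    """One TFN filter branch: l_i=0 (x) l_f=1 -> l_o=1.
    The Clebsch-Gordan contraction is the identity for this branch, and the
    l=1 real spherical harmonics are proportional to (x, y, z)."""
    N = len(pos)
    rij = pos[:, None, :] - pos[None, :, :]              # relative positions
    dist = np.linalg.norm(rij, axis=-1) + np.eye(N)      # dodge 0/0 on the diagonal
    rhat = rij / dist[..., None]
    Rn = np.exp(-(dist[..., None] - widths) ** 2) @ w    # radial profile R_n(|r_ij|)
    F = Rn[..., None] * rhat                             # filter values, shape (N, N, 3)
    F[np.arange(N), np.arange(N)] = 0.0                  # drop self-edges
    return np.einsum('j,ijm->im', h0, F)                 # l=1 outputs, shape (N, 3)

pos = rng.normal(size=(6, 3))
h0 = rng.normal(size=6)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))             # random rotation
Q *= np.sign(np.linalg.det(Q))
assert np.allclose(conv(pos @ Q.T, h0), conv(pos, h0) @ Q.T)   # SO(3) equivariance
```

Higher-order branches follow the same pattern but contract filter and feature $m$-indices against nontrivial Clebsch–Gordan tensors.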

fTNS constructions use CFT correlators as local functional elements,

$$c_{s_1\cdots s_N} \propto \langle \chi_{s_1}:e^{i\sqrt{\alpha}s_1\phi(z_1)}: \cdots \chi_{s_N}:e^{i\sqrt{\alpha}s_N\phi(z_N)}: \rangle$$

which is mapped to a path integral over fields on a cylinder partitioned into $N$ strips, each with boundary data and possible vertex insertions.

4. Symmetry Action and Topological Classification

Symmetry actions in TFNs are directly embedded through the choice of irreducible representations and the constraint that every layer is equivariant to $SO(3)$, $T(\mathbb{R}^3)$, and permutations. Nonlinearities are designed to preserve equivariance, acting on channel indices and tensor norms.
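One common equivariance-preserving choice is a norm-gated nonlinearity for $\ell \ge 1$ features: rescale each vector by a scalar function of its norm, leaving the direction untouched. A sketch (the exact gating function here is an assumption):

```python
import numpy as np

rng = np.random.default_rng(4)

def norm_nonlinearity(v, eps=1e-9):
    """Gated nonlinearity for l>=1 features: rescale each vector by a scalar
    function of its rotation-invariant norm, preserving equivariance."""
    n = np.linalg.norm(v, axis=-1, keepdims=True)
    return np.tanh(n) / (n + eps) * v

v = rng.normal(size=(7, 3))                     # l=1 features at 7 points
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))    # random rotation
Q *= np.sign(np.linalg.det(Q))
assert np.allclose(norm_nonlinearity(v @ Q.T), norm_nonlinearity(v) @ Q.T)
```

Because the norm is rotation-invariant, any elementwise scalar function of it commutes with the rotation action.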

In fTNS, physical-site symmetries (e.g., SU(2) generated by the Pauli matrices) correspond, on the virtual space, to affine Lie (Kac–Moody) currents of the CFT:

$$H(z) = :i\partial\phi(z):, \qquad E^\pm(z) = :e^{\pm i\sqrt{2}\phi(z)}:$$

with contour-integrated charges,

$$Q^a = \frac{1}{2\pi i}\oint dz\, J^a(z)$$

The symmetry action on the functionals is specified by commutators with these charges. Projective representations in the virtual space, diagnosed via group commutators (e.g., $V_x V_y V_x^{-1} V_y^{-1} = -1$ for SU(2)), establish that fTNS can carry critical symmetry-protected topological (SPT) order, analogous to the MPS classification via cocycles $\omega \in \mathcal{H}^2(G, U(1))$ (Gasull et al., 2022).
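For the spin-1/2 virtual representation this diagnostic is a two-line computation: $\pi$-rotations about $x$ and $y$ are represented (projectively) by $i\sigma_x$ and $i\sigma_y$, and their group commutator is $-\mathbb{1}$:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

# pi-rotations about x and y in the spin-1/2 (projective) representation:
# V_a = exp(i*pi*sigma_a/2) = i*sigma_a
Vx, Vy = 1j * sx, 1j * sy

# The group commutator is a basis-independent diagnostic of the projective class.
comm = Vx @ Vy @ np.linalg.inv(Vx) @ np.linalg.inv(Vy)
print(np.allclose(comm, -np.eye(2)))    # True: nontrivial (SPT) class
```

For an integer-spin (linear) representation the same commutator equals $+\mathbb{1}$, which is what distinguishes the trivial class.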

5. Implementation and Complexity

TFNs are implemented as deep architectures with convolutional modules of the form convolution $\rightarrow$ self-interaction $\rightarrow$ nonlinearity. In practice, feature rotation orders typically go up to $\ell = 2$ or $3$. Radial profiles $R_n$ are parameterized by small multilayer perceptrons over a fixed Gaussian-basis expansion of $r$. Computational complexity is $O(\sum_\ell n_\ell \times |N(i)| \times (2\ell+1)^2)$ per neighbor, with neighborhood pruning for scalability. GPU-accelerated implementations enable operation on point sets of thousands of points (Thomas et al., 2018).
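The radial parameterization can be sketched as a small MLP over a fixed Gaussian basis; the basis centers, width, and layer sizes below are illustrative choices, not those of the reference implementation.

```python
import numpy as np

rng = np.random.default_rng(5)

def gaussian_basis(r, centers=np.linspace(0.0, 3.0, 8), gamma=4.0):
    """Fixed Gaussian expansion of the distance r."""
    return np.exp(-gamma * (r[..., None] - centers) ** 2)

def radial_mlp(r, W1, b1, W2, b2):
    """R_n(r): small MLP over the Gaussian basis, one output per channel n."""
    h = np.maximum(gaussian_basis(r) @ W1 + b1, 0.0)   # ReLU hidden layer
    return h @ W2 + b2                                 # shape (..., n_channels)

W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)         # 4 radial channels
r = rng.uniform(0.5, 2.5, size=10)                     # 10 edge lengths
print(radial_mlp(r, W1, b1, W2, b2).shape)             # (10, 4)
```

Since $R_n$ depends only on the rotation-invariant distance, any parameterization of it leaves the layer's equivariance intact.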

6. Applications and Empirical Results

TFNs have demonstrated strong empirical performance in multiple domains:

  • Geometry: in a 3D-Tetris shape classification task, TFNs achieved 100% accuracy under random rotations and translations without any rotational augmentation, while standard non-equivariant networks failed to generalize.
  • Physics: TFNs learned the Newtonian inverse-square force law (vector outputs at $\ell=1$) and the moment of inertia tensor (outputs at $\ell=0 \oplus 2$) from data, matching the analytic decompositions.
  • Chemistry: on missing-atom prediction in quantum chemistry datasets, TFNs exhibited atom-type and position accuracy exceeding 91%, generalizing to molecules larger than those seen in training, with a mean absolute position error of $0.14\,\text{Å}$ and per-element accuracy above 94% for C and H.
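The $\ell = 0 \oplus 2$ decomposition of the moment of inertia tensor mentioned above is simply its split into an isotropic trace part and a symmetric traceless part:

```python
import numpy as np

rng = np.random.default_rng(6)
r = rng.normal(size=(12, 3))            # unit point masses

# Moment of inertia tensor I = sum_a (|r_a|^2 * Id - r_a r_a^T)
I = np.einsum('ai,ai->', r, r) * np.eye(3) - np.einsum('ai,aj->ij', r, r)

# Decompose into irreps: l=0 (isotropic trace part) + l=2 (symmetric traceless part)
l0 = np.trace(I) / 3 * np.eye(3)
l2 = I - l0
assert abs(np.trace(l2)) < 1e-10        # l=2 part is traceless
assert np.allclose(l0 + l2, I)          # decomposition is exact
```

A network whose output type is declared as $\ell = 0 \oplus 2$ predicts exactly these two pieces, with the correct transformation law built in.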

For fTNS, the method provides an analytic framework to derive the critical SPT properties of quantum spin chain ground states, with explicit application to the Majumdar–Ghosh model. In this system, the two dimerized ground states correspond to SPT classes differing by their representations of SU(2) symmetry in the virtual space; the distinction is diagnosed by evaluating the group commutator, demonstrating the fundamental SPT structure present in critical, CFT-constructed tensor networks (Gasull et al., 2022).
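The Majumdar–Ghosh statement is easy to verify by exact diagonalization on a small periodic chain, assuming the standard normalization $H = \sum_i S_i \cdot S_{i+1} + \tfrac{1}{2}\sum_i S_i \cdot S_{i+2}$: the nearest-neighbor singlet product is an exact eigenstate with energy $-3N/8$.

```python
import numpy as np
from functools import reduce

# Spin-1/2 operators
sx = np.array([[0, 1], [1, 0]]) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]]) / 2
ops = [sx, sy, sz]
N = 6                                    # periodic chain length

def site_op(op, i):
    """Embed a single-site operator at site i of the N-site chain."""
    mats = [np.eye(2)] * N
    mats[i] = op
    return reduce(np.kron, mats)

def SdotS(i, j):
    return sum(site_op(o, i) @ site_op(o, j) for o in ops)

# Majumdar-Ghosh Hamiltonian: H = sum_i S_i.S_{i+1} + 1/2 sum_i S_i.S_{i+2}
H = sum(SdotS(i, (i + 1) % N) + 0.5 * SdotS(i, (i + 2) % N) for i in range(N))

# Dimerized state: product of nearest-neighbor singlets on bonds (0,1), (2,3), (4,5)
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)
psi = reduce(np.kron, [singlet] * (N // 2))

E = -3 * N / 8                           # exact eigenvalue
assert np.allclose(H @ psi, E * psi)
```

The second dimer covering (bonds shifted by one site) passes the same check, exhibiting the two degenerate ground states whose SPT distinction the fTNS analysis diagnoses.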

7. Generalizations and Broader Context

TFNs provide a template for constructing neural network architectures respecting geometric symmetries of Euclidean space, relevant for any learning task on 3D point clouds or molecular graphs where physical invariance is desired (Thomas et al., 2018).

fTNS generalizes the MPS formalism to critical systems where the virtual space is continuous/infinite-dimensional, supporting both critical behavior and symmetry/topological distinctions inaccessible to finite-bond-dimension tensor networks. Any CFT with local action and suitable primary fields admits an associated fTNS, with the SPT classification directly encoded by the projective properties of the CFT's symmetry currents. The formalism enables the identification and interpolation between trivial and nontrivial SPT phases via control over the spatial configuration of vertex operator insertions in the CFT correlator, offering a systematic route to classifying 1D critical SPT states (Gasull et al., 2022).
