Efficient Tensor Network Algorithms for Spin Foam Models (2406.19676v1)
Abstract: Numerical computations and methods have become increasingly crucial in the study of spin foam models across various regimes. This paper adds to this field by introducing new algorithms based on tensor network methods for computing amplitudes, focusing on topological SU(2) BF and Lorentzian EPRL spin foam models. By reorganizing the sums and tensors involved, vertex amplitudes are recast as a sequence of matrix contractions. This reorganization significantly reduces computational complexity and memory usage, allowing for scalable and efficient computations of the amplitudes for larger representation labels on standard consumer hardware, which was previously infeasible due to the computational demands of high-valent tensors. We apply these tensor network algorithms to analyze the characteristics of various vertex configurations, including Regge and vector geometries for the SU(2) BF theory, demonstrating consistent scaling behavior and differing oscillation patterns. Our benchmarks reveal substantial improvements in computational time and memory allocations, especially for large representation labels. Additionally, these tensor network methods are applicable to generic 2-complexes with multiple vertices, where we introduce partial-coherent vertex amplitudes to streamline the computations. The implementation of these algorithms is available on GitHub for further exploration and use.
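The core idea of the reorganization — replacing one sum over a high-valent tensor with a sequence of pairwise matrix contractions — can be illustrated with a toy example. The sketch below is not the paper's actual algorithm (which operates on SU(2) BF and EPRL vertex tensors); it only demonstrates, on a closed chain of generic matrices, why the sequential form is cheaper: the naive sum touches all internal indices at once (O(d^4) here), while the matrix-product form costs O(d^3) per contraction. The dimension `d` is a stand-in for a representation-label cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 6  # toy dimension, standing in for a representation-label cutoff
mats = [rng.standard_normal((d, d)) for _ in range(4)]

# Naive contraction: sum over all four internal indices simultaneously,
# as if evaluating one high-valent tensor -- O(d^4) scalar multiplications.
naive = sum(
    mats[0][i, j] * mats[1][j, k] * mats[2][k, l] * mats[3][l, i]
    for i in range(d)
    for j in range(d)
    for k in range(d)
    for l in range(d)
)

# Reorganized contraction: the identical sum, recast as a sequence of
# matrix products followed by a trace -- O(d^3) per pairwise product.
sequential = np.trace(mats[0] @ mats[1] @ mats[2] @ mats[3])

assert np.isclose(naive, sequential)
```

The same principle extends to higher-valent vertex tensors: choosing a contraction order that keeps every intermediate object a matrix (or low-valent tensor) bounds both the floating-point cost and the peak memory, which is what makes larger representation labels tractable on consumer hardware.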