Existence and efficient findability of belief-propagation fixed points for tensor networks

Determine precise classes of tensor networks, such as projected entangled pair states (PEPS) on arbitrary graphs, for which the belief-propagation message-passing equations admit fixed points, and establish algorithms with rigorous convergence guarantees that find such fixed points efficiently. In particular, ascertain under what conditions fixed points exist and message passing converges for tensor-network contraction.

Background

The cluster and cluster–cumulant expansions developed in the paper provide rigorous error guarantees only when expanded around a suitable belief-propagation (BP) fixed point that exhibits loop decay. In practice, however, tensor networks can possess multiple BP fixed points, and standard message-passing may converge to an inappropriate stable solution (e.g., the symmetry-broken solution in a confusion regime) or fail to converge.

The authors emphasize that expanding around a poor fixed point invalidates the guarantees, highlighting a foundational algorithmic gap: when do fixed points exist for a given tensor network, and when can they be efficiently and reliably found by message-passing with provable convergence? This gap motivates a formal characterization of existence and convergence conditions for BP on tensor networks.
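To make the algorithmic question concrete, the following is a minimal sketch of the kind of message-passing iteration at issue: damped BP updates on a closed (scalar) tensor network, where each edge carries a unit-norm message vector and each vertex tensor is contracted with all incoming messages except the one along the outgoing edge. The function name, calling convention, and damping scheme are illustrative assumptions, not the paper's algorithm; the open problem is precisely when such an iteration provably converges.

```python
import numpy as np

def bp_fixed_point(tensors, neighbors, chi, damping=0.5, tol=1e-10,
                   max_iter=1000):
    """Damped BP iteration for a closed tensor network (illustrative sketch).

    tensors[v]   -- ndarray with one axis of size `chi` per neighbor of v,
                    axes ordered exactly as in neighbors[v]
    neighbors[v] -- ordered list of vertices adjacent to v
    Returns (messages, converged): messages[(v, w)] is the unit-norm
    message vector from v to w; converged reports whether the largest
    per-sweep change fell below `tol`.
    """
    # Deterministic uniform initialization; in general the fixed point
    # reached (if any) can depend on this choice.
    msgs = {(v, w): np.full(chi, 1.0 / np.sqrt(chi))
            for v, nbrs in neighbors.items() for w in nbrs}
    converged = False
    for _ in range(max_iter):
        delta = 0.0
        for v, nbrs in neighbors.items():
            for w in nbrs:
                # Contract T_v with every incoming message except the one
                # from w; contracting the highest axis first keeps the
                # remaining axis indices stable.
                t = tensors[v]
                for ax, u in reversed(list(enumerate(nbrs))):
                    if u == w:
                        continue
                    t = np.tensordot(t, msgs[(u, v)], axes=([ax], [0]))
                t = t / np.linalg.norm(t)
                m = (1.0 - damping) * t + damping * msgs[(v, w)]
                m = m / np.linalg.norm(m)
                delta = max(delta, np.linalg.norm(m - msgs[(v, w)]))
                msgs[(v, w)] = m
        if delta < tol:
            converged = True
            break
    return msgs, converged
```

On networks with entrywise-positive tensors (e.g., a small cycle, where BP reduces to power iteration) this iteration typically settles quickly; the difficulty highlighted in the text is that for general tensor networks no such convergence, or even existence of a fixed point, is guaranteed, and the iteration may oscillate or land on an inappropriate solution.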

References

In general, the fixed-point problem remains open: it is not known for which classes of TNs fixed points exist, nor whether they can be found efficiently (e.g., with guarantees on the convergence of message passing on TNs).

Belief Propagation and Tensor Network Expansions for Many-Body Quantum Systems: Rigorous Results and Fundamental Limits  (2604.03228 - Midha et al., 3 Apr 2026) in Section: Discussions (final paragraphs)