Positive bias makes tensor-network contraction tractable (2410.05414v1)

Published 7 Oct 2024 in quant-ph, cs.CC, and cs.DS

Abstract: Tensor network contraction is a powerful computational tool in quantum many-body physics, quantum information and quantum chemistry. The complexity of contracting a tensor network is thought to mainly depend on its entanglement properties, as reflected by the Schmidt rank across bipartite cuts. Here, we study how the complexity of tensor-network contraction depends on a different notion of quantumness, namely, the sign structure of its entries. We tackle this question rigorously by investigating the complexity of contracting tensor networks whose entries have a positive bias. We show that for intermediate bond dimension d ≳ n, a small positive mean value ≳ 1/d of the tensor entries already dramatically decreases the computational complexity of approximately contracting random tensor networks, enabling a quasi-polynomial time algorithm for arbitrary 1/poly(n) multiplicative approximation. At the same time, exactly contracting such tensor networks remains #P-hard, as in the zero-mean case [HHEG20]. The mean value 1/d matches the phase transition point observed in [CJHS24]. Our proof makes use of Barvinok's method for approximate counting and the technique of mapping random instances to statistical mechanical models. We further consider the worst-case complexity of approximate contraction of positive tensor networks, where all entries are non-negative. We first give a simple proof showing that a multiplicative approximation with error exponentially close to one is at least StoqMA-hard. We then show that when considering additive error in the matrix 1-norm, the contraction of positive tensor networks is BPP-complete. This result compares to Arad and Landau's [AL10] result, which shows that for general tensor networks, approximate contraction up to matrix 2-norm additive error is BQP-complete.

Summary

  • The paper demonstrates that a positive mean in tensor entries reduces contraction complexity, allowing a quasi-polynomial time approximation for 2D tensor networks when the bond dimension satisfies d ≳ n.
  • It presents a quasi-polynomial time algorithm that approximates contraction values within any 1/poly(n) multiplicative error for random tensor networks with a positive bias.
  • The research maps tensor networks to statistical mechanical models, pinpointing a phase transition at mean value μ ≈ 1/d that offers new insights for quantum many-body simulations.

Overview of the Paper: Positive Bias Makes Tensor-Network Contraction Tractable

In the paper "Positive Bias Makes Tensor-Network Contraction Tractable," the authors address the computational complexity of tensor-network contraction, a critical tool in quantum many-body physics, quantum information, and quantum chemistry. While the complexity is often linked to the entanglement properties of tensor networks, this research explores the impact of the sign structure of tensor entries on the complexity of contraction. The paper specifically investigates whether a positive bias in the tensor entries can make contraction more tractable.
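
To fix notation, here is a minimal sketch of what contracting a tensor network means, assuming numpy; the 2×2 grid geometry, bond dimension, and bias value below are illustrative choices, not instances taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, mu = 8, 1.0 / 8  # bond dimension and entry mean (illustrative values)

def tensor(*shape):
    """Random tensor whose i.i.d. entries have mean mu and unit variance."""
    return mu + rng.standard_normal(shape)

# Smallest 2D example: a 2x2 open-boundary grid with one bond per edge.
# Summing over every shared index reduces the network to a scalar Z.
T1, T2 = tensor(d, d), tensor(d, d)  # top row:    (right, down), (left, down)
T3, T4 = tensor(d, d), tensor(d, d)  # bottom row: (up, right),   (up, left)
Z = np.einsum('ab,ac,bd,cd->', T1, T2, T3, T4)
print(Z)
```

This toy grid contracts instantly, but for an n × n grid the cost of exact contraction grows exponentially in n in general, which is the hardness backdrop against which the paper's positive-bias result is set.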

Key Contributions

  1. Analyzing Positive Bias: The paper primarily demonstrates that a positive mean in tensor entries significantly reduces the computational complexity of tensor-network contraction. It establishes this by proving that approximate contraction of 2D tensor networks with a positive bias admits a quasi-polynomial time algorithm when the bond dimension satisfies d ≳ n.
  2. Quasi-Polynomial Time Algorithm: For random tensor networks with entries having a positive mean (μ ≳ 1/d), the authors design a quasi-polynomial time algorithm that approximates contraction values within any 1/poly(n) multiplicative error with high probability (a schematic sketch of the underlying Barvinok interpolation idea follows this list).
  3. Phase Transition Point: The threshold μ ≳ 1/d matches the phase transition point previously observed in [CJHS24], suggesting a fundamental change in contraction complexity at this mean value.
  4. Exact Contraction Complexity: The paper confirms that exact contraction of such tensor networks remains #P-hard, as in the zero-mean case, showing that positivity eases approximation but not exact computation.
  5. Mapping to Statistical Models: The authors employ a mapping from random tensor networks to statistical mechanical models, specifically a 2D Ising model, to underpin their theoretical analysis. This approach allows them to rigorously quantify the impact of a positive bias on computational complexity.
  6. Complexity of Positive Tensor Networks: For positive tensor networks, where all entries are non-negative, the paper shows that multiplicative approximation remains StoqMA-hard, even with error exponentially close to one. However, with additive error in the matrix 1-norm, the problem becomes BPP-complete, indicating classical tractability.
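
On the algorithmic side, the quasi-polynomial runtime comes from Barvinok's polynomial interpolation method. The sketch below illustrates only the generic technique, not the paper's construction: it recovers the low-order Taylor coefficients of log f from those of a polynomial f and extrapolates to f(1), which is accurate whenever f is zero-free on a disk of radius larger than one. The toy polynomial at the end is an assumption chosen purely for demonstration.

```python
import math
import numpy as np

def log_taylor(a, m):
    """Taylor coefficients b[1..m] of log f at 0, given coefficients
    a[0..m] of f with a[0] > 0, via the identity f' = f * (log f)'."""
    b = np.zeros(m + 1)
    for k in range(1, m + 1):
        s = sum(j * b[j] * a[k - j] for j in range(1, k))
        b[k] = (k * a[k] - s) / (k * a[0])
    return b

def barvinok_estimate(a, m):
    """Estimate f(1) as exp of the degree-m Taylor truncation of log f.
    If f has no zeros on a disk of radius > 1, taking m = O(log(1/eps))
    terms already yields a (1 + eps)-multiplicative approximation."""
    b = log_taylor(a, m)
    return math.exp(math.log(a[0]) + b[1:].sum())

# Toy check: f(z) = (1 + z/2)^4 has its only zero at z = -2, so the
# truncated series converges geometrically to f(1) = 1.5^4 = 5.0625.
a = [math.comb(4, k) * 0.5**k for k in range(5)]
print(barvinok_estimate(a, m=4))  # ~4.97; pad a with zeros and raise m to tighten
```

In the paper's setting the role of f is played by an interpolation of the contraction value, and the positive bias is what keeps the relevant zero-free region under control; the recursion above is just the standard bookkeeping that converts coefficients of f into coefficients of log f.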

Implications and Future Directions

The findings of this paper advance understanding in both theoretical and practical aspects of tensor-network computations. The existence of a quasi-polynomial time approximation for positively biased tensor networks may inspire new algorithms or enhance existing heuristics used in quantum many-body simulations.

The identification of the transition point μ ≈ 1/d highlights potential areas for further theoretical exploration, particularly examining the sign structure's influence on computational efficiency beyond current models. In practical terms, these insights could lead to more effective contraction algorithms in contexts where positivity in tensor entries is naturally present or can be engineered.
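
As a purely illustrative probe of this concentration effect, assuming numpy and using a 1D matrix chain as a stand-in for the paper's 2D grids (so the crossover location here need not match the 2D threshold μ ≈ 1/d), one can watch the relative fluctuation of the contraction value shrink as the entry mean grows; it is this concentration that makes multiplicative approximation of typical instances meaningful.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_Z(n, d, mu):
    """One sample of the contraction value of a chain of n random d x d
    matrices whose i.i.d. entries have mean mu and unit variance."""
    M = np.eye(d)
    for _ in range(n):
        M = M @ (mu + rng.standard_normal((d, d)))
    return np.trace(M)

# Relative fluctuation of Z across random instances: once the mean term
# dominates the noise, Z concentrates around E[Z].
n, d, trials = 6, 16, 200
for mu in [0.0, 1 / d, 1 / np.sqrt(d), 1.0]:
    zs = np.array([sample_Z(n, d, mu) for _ in range(trials)])
    print(f"mu = {mu:.4f}   std/|mean| = {zs.std() / abs(zs.mean()):.2f}")
```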

Conclusion

This research reveals how even slight adjustments in tensor sign structure can lead to tractable computational problems, thus broadening the set of efficiently contractible tensor networks. The interdisciplinary approach combining quantum information theory and statistical physics provides a robust framework for addressing longstanding challenges in tensor-network contraction, paving the way for future advancements in quantum simulations and beyond.
