
Optimizing Tensor Contraction Paths: A Greedy Algorithm Approach With Improved Cost Functions (2405.09644v1)

Published 8 May 2024 in quant-ph, cs.DM, and cs.MS

Abstract: Finding efficient tensor contraction paths is essential for a wide range of problems, including model counting, quantum circuits, graph problems, and LLMs. There exist several approaches to find efficient paths, such as the greedy and random greedy algorithm by Optimized Einsum (opt_einsum), and the greedy algorithm and hypergraph partitioning approach employed in cotengra. However, these algorithms require a lot of computational time and resources to find efficient contraction paths. In this paper, we introduce a novel approach based on the greedy algorithm by opt_einsum that computes efficient contraction paths in less time. Moreover, with our approach, we are even able to compute paths for large problems where modern algorithms fail.
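A contraction path is an ordered list of pairwise contractions, and the choice of order determines the size and cost of every intermediate tensor. As a minimal sketch of the baseline the paper builds on, NumPy's `einsum_path` (reference 9 below) exposes the greedy optimizer provided through opt_einsum; the tensor shapes here are illustrative only:

```python
import numpy as np

# Three chained factors; the order of pairwise contractions determines
# the size (and cost) of the intermediate tensors.
a = np.random.rand(8, 32)
b = np.random.rand(32, 4)
c = np.random.rand(4, 16)

# Ask the greedy optimizer (the baseline approach the paper improves on,
# as exposed through NumPy / opt_einsum) for a contraction path.
path, info = np.einsum_path('ij,jk,kl->il', a, b, c, optimize='greedy')

# Reuse the precomputed path for the actual contraction.
result = np.einsum('ij,jk,kl->il', a, b, c, optimize=path)
```

The returned `path` is a list beginning with `'einsum_path'` followed by index pairs, each naming the two operands contracted at that step; the cost function studied in the paper is what decides which pair the greedy algorithm picks at each step.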

References (11)
  1. Eli Meirom et al. Optimizing tensor network contraction using reinforcement learning. In International Conference on Machine Learning, 2022.
  2. Hyper-optimized tensor network contraction. Quantum, 2021.
  3. The tensor networks anthology: Simulation techniques for many-body quantum lattice systems. SciPost Physics Lecture Notes, 2019.
  4. Efficient contraction of large tensor networks for weighted model counting through graph decompositions. arXiv:1908.04381, 2020.
  5. Faster identification of optimal contraction sequences for tensor networks. Physical Review E, 2014.
  6. Algorithms for tensor network contraction ordering. Machine Learning: Science and Technology, 2020.
  7. opt_einsum: A Python package for optimizing contraction order for einsum-like expressions. Journal of Open Source Software, 2018.
  8. Optimized Einsum. https://dgasmith.github.io/opt_einsum/ (retrieved 07.02.2024).
  9. NumPy Developers. numpy.einsum. https://numpy.org/doc/stable/reference/generated/numpy.einsum.html (retrieved 07.02.2024).
  10. Johnnie Gray. cotengra. https://github.com/jcmgray/cotengra (retrieved 07.03.2024).
  11. High-quality hypergraph partitioning. ACM Journal of Experimental Algorithmics, 2022.
