Fine-Grained Tensor Network Methods (1911.04882v3)

Published 12 Nov 2019 in cond-mat.str-el, hep-lat, hep-th, and quant-ph

Abstract: We develop a strategy for tensor network algorithms that allows one to deal very efficiently with lattices of high connectivity. The basic idea is to fine-grain the physical degrees of freedom, i.e., decompose them into more fundamental units which, after a suitable coarse-graining, recover the original ones. Thanks to this procedure, the original lattice with high connectivity is transformed by an isometry into a simpler structure, which is easier to simulate via usual tensor network methods. In particular, this enables the use of standard schemes to contract infinite 2d tensor networks - such as Corner Transfer Matrix Renormalization schemes - which are more involved to apply on complex lattice structures. We prove the validity of our approach by numerically computing the ground-state properties of the ferromagnetic spin-1 transverse-field Ising model on the 2d triangular and 3d stacked triangular lattice, as well as of the hard-core and soft-core Bose-Hubbard models on the triangular lattice. Our results are benchmarked against those obtained with other techniques, such as perturbative continuous unitary transformations and graph projected entangled pair states, showing excellent agreement and also improved performance in several regimes.
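
To make the fine-graining idea concrete, the following is a minimal NumPy sketch of the kind of isometric coarse-graining the abstract describes. It assumes, purely for illustration, that a spin-1 site is fine-grained into two spin-1/2 units and coarse-grained back by projecting onto their triplet subspace; the decomposition and lattice geometry actually used in the paper may differ.

```python
import numpy as np

# Coarse-graining isometry W: maps two spin-1/2 units (dim 4 = 2 x 2)
# onto one spin-1 site (dim 3) by projecting onto their triplet subspace.
# Basis order for the pair: |uu>, |ud>, |du>, |dd>.  (Illustrative choice,
# not necessarily the decomposition used in the paper.)
s = 1.0 / np.sqrt(2.0)
W = np.array([
    [1.0, 0.0, 0.0, 0.0],   # |1, +1> = |uu>
    [0.0,   s,   s, 0.0],   # |1,  0> = (|ud> + |du>) / sqrt(2)
    [0.0, 0.0, 0.0, 1.0],   # |1, -1> = |dd>
])

# W is an isometry: W W^dagger = identity on the spin-1 space, so
# fine-graining (W^dagger) followed by coarse-graining (W) is exact.
assert np.allclose(W @ W.conj().T, np.eye(3))

# Example: the spin-1 operator S^z pulled back to the fine-grained pair.
Sz_coarse = np.diag([1.0, 0.0, -1.0])
Sz_fine = W.conj().T @ Sz_coarse @ W        # acts on the two spin-1/2 units
print(np.allclose(W @ Sz_fine @ W.conj().T, Sz_coarse))  # True
```

Because W W^dagger is the identity on the coarse space, operators of the original model can be pulled back exactly onto the fine-grained lattice before contracting the resulting tensor network with standard 2d schemes.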
