Loop optimization for tensor network renormalization (1512.04938v2)
Published 15 Dec 2015 in cond-mat.str-el, cond-mat.stat-mech, and quant-ph
Abstract: We introduce a tensor renormalization group scheme for coarse-graining a two-dimensional tensor network that can be successfully applied to both classical and quantum systems on and off criticality. The key innovation in our scheme is to deform a 2D tensor network into small loops and then optimize the tensors on each loop. In this way, we remove short-range entanglement at each iteration step and significantly improve the accuracy and stability of the renormalization flow. We demonstrate our algorithm in the classical Ising model and a frustrated 2D quantum model.
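For context on the coarse-graining the abstract refers to, below is a minimal sketch of a plain (Levin-Wen) tensor renormalization group step applied to the paper's first benchmark, the 2D classical Ising model. It deliberately omits the paper's key innovation: the deformation of the network into small loops and the variational optimization of the tensors on each loop that removes short-range entanglement. The function names, the bond-dimension cap `chi`, and the single-tensor torus closure are illustrative choices, not taken from the paper.

```python
import numpy as np

def ising_tensor(beta):
    """Rank-4 Boltzmann tensor T[u, r, d, l] for the square-lattice Ising model."""
    c, s = np.sqrt(np.cosh(beta)), np.sqrt(np.sinh(beta))
    W = np.array([[c, s], [c, -s]])  # bond weight factors: exp(beta*s_i*s_j) = (W @ W.T)[i, j]
    return np.einsum('au,ar,ad,al->urdl', W, W, W, W)

def split(M, chi):
    """Truncated SVD M ~ A @ B with inner dimension at most chi."""
    U, s, Vh = np.linalg.svd(M, full_matrices=False)
    k = min(chi, len(s))
    return U[:, :k] * np.sqrt(s[:k]), np.sqrt(s[:k])[:, None] * Vh[:k]

def trg_step(T, chi):
    """One plain TRG coarse-graining step: N sites -> N/2 sites on a rotated lattice."""
    D = T.shape[0]
    # Split along one diagonal: T[u,r,d,l] ~ sum_a F1[u,l,a] F2[a,d,r]
    A, B = split(np.transpose(T, (0, 3, 2, 1)).reshape(D * D, D * D), chi)
    F1, F2 = A.reshape(D, D, -1), B.reshape(-1, D, D)
    # Split along the other diagonal: T[u,r,d,l] ~ sum_b F3[u,r,b] F4[b,d,l]
    A, B = split(T.reshape(D * D, D * D), chi)
    F3, F4 = A.reshape(D, D, -1), B.reshape(-1, D, D)
    # Contract the four half-tensors around a plaquette; the SVD bonds become
    # the up/right/down/left legs of the coarse-grained tensor.
    return np.einsum('aij,bkj,kmc,ime->abce', F2, F4, F1, F3)

def ising_free_energy(beta, chi=16, n_steps=20):
    """Free energy per site from repeated TRG steps, closed on a torus at the end."""
    T = ising_tensor(beta)
    f = 0.0
    for step in range(n_steps):
        norm = np.abs(T).max()
        T /= norm
        f += np.log(norm) / 2 ** step   # each tensor now represents 2**step sites
        T = trg_step(T, chi)
    f += np.log(np.einsum('urur->', T)) / 2 ** n_steps
    return -f / beta

if __name__ == '__main__':
    beta_c = 0.5 * np.log(1.0 + np.sqrt(2.0))   # exact Ising critical point
    print(ising_free_energy(beta_c, chi=16, n_steps=20))
```

In the paper's scheme, the simple truncated SVDs above would be supplemented by an entanglement-filtering and loop-optimization stage on small loops of tensors before truncation; per the abstract, that is what removes the short-range entanglement plain TRG accumulates and stabilizes the renormalization flow at criticality.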