Learning Circuits with Infinite Tensor Networks (2506.02105v1)
Abstract: Hamiltonian simulation on quantum computers is strongly constrained by gate counts, motivating techniques to reduce circuit depths. While tensor networks are natural competitors to quantum computers, we instead leverage them to support circuit design, with datasets of tensor networks enabling a unitary synthesis inspired by quantum machine learning. For a target simulation in the thermodynamic limit, translation invariance is exploited to significantly reduce the optimization complexity, avoiding a scaling with system size. Our approach finds circuits that efficiently prepare ground states and perform time evolution on both infinite and finite systems with substantially lower gate depths than conventional Trotterized methods. In addition to reducing CNOT depths, we motivate similar utility for fault-tolerant quantum algorithms, with a demonstrated $5.2\times$ reduction in $T$-count to realize $e^{-iHt}$. The key output of our approach is the optimized unit-cell of a translation invariant circuit. This provides an advantage for Hamiltonian simulation of finite, yet arbitrarily large, systems on real quantum computers.
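To make the circuit-learning idea concrete, the sketch below is a minimal, self-contained illustration rather than the paper's method: it optimizes the two-qubit gate of a spatially translation-invariant brickwork circuit so that the circuit approximates $e^{-iHt}$ for a small transverse-field Ising chain, and compares the result against a first-order Trotter circuit of the same depth. The paper instead trains against tensor-network data for infinite systems; here exact state-vector evolution of a short chain stands in for that target, and all names and values (`N`, `J`, `g`, `t`, `n_layers`, the gate parameterization) are illustrative assumptions.

```python
# Minimal sketch (illustrative only): learn the two-qubit unit cell of a
# translation-invariant brickwork circuit approximating e^{-iHt} for a small
# transverse-field Ising chain, then compare with first-order Trotterization.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

N, J, g, t = 6, 1.0, 0.7, 0.5          # chain length, couplings, evolution time (assumed values)
n_layers = 2                            # brickwork depth of the learned circuit

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(ops_by_site):
    """Tensor product with the given single-site operators, identity elsewhere."""
    out = np.array([[1.0 + 0j]])
    for s in range(N):
        out = np.kron(out, ops_by_site.get(s, I2))
    return out

# Transverse-field Ising Hamiltonian H = -J sum Z_i Z_{i+1} - g sum X_i (open boundaries).
H = sum(-J * op_on({i: Z, i + 1: Z}) for i in range(N - 1))
H = H + sum(-g * op_on({i: X}) for i in range(N))

U_exact = expm(-1j * t * H)
psi0 = np.zeros(2 ** N, dtype=complex)
psi0[0] = 1.0                           # start from |00...0>
target = U_exact @ psi0                 # stand-in for the tensor-network target state

def embed_two_site(G, i):
    """Embed a 4x4 gate acting on sites (i, i+1) into the full 2^N-dimensional space."""
    out = np.array([[1.0 + 0j]])
    s = 0
    while s < N:
        if s == i:
            out = np.kron(out, G)
            s += 2
        else:
            out = np.kron(out, I2)
            s += 1
    return out

def gate_from_params(p):
    """4x4 unitary from 16 real parameters via a Hermitian generator."""
    M = p.reshape(4, 4)
    herm = (M + M.T) + 1j * (M - M.T)
    return expm(-1j * herm)

def brickwork_state(params):
    """Apply n_layers of a brickwork built from one gate per layer (the unit cell),
    repeated on every even bond and then every odd bond."""
    psi = psi0.copy()
    for layer in range(n_layers):
        G = gate_from_params(params[16 * layer:16 * (layer + 1)])
        for start in (0, 1):
            for i in range(start, N - 1, 2):
                psi = embed_two_site(G, i) @ psi
    return psi

def infidelity(params):
    return 1.0 - abs(np.vdot(target, brickwork_state(params))) ** 2

rng = np.random.default_rng(0)
res = minimize(infidelity, 0.1 * rng.standard_normal(16 * n_layers), method="L-BFGS-B")

def trotter_state(n_steps):
    """First-order Trotter circuit with the same number of two-qubit layers."""
    dt = t / n_steps
    psi = psi0.copy()
    U_bond = expm(1j * J * dt * np.kron(Z, Z))                      # e^{-i(-J ZZ) dt}
    U_field = expm(1j * g * dt * sum(op_on({i: X}) for i in range(N)))
    for _ in range(n_steps):
        for start in (0, 1):
            for i in range(start, N - 1, 2):
                psi = embed_two_site(U_bond, i) @ psi
        psi = U_field @ psi
    return psi

print("learned-circuit infidelity:", res.fun)
print("Trotter infidelity:        ",
      1.0 - abs(np.vdot(target, trotter_state(n_layers))) ** 2)
```

In the paper's setting the optimized object is the unit cell of a translation-invariant circuit, so the same learned gates can be tiled onto finite but arbitrarily large systems; the sketch above mimics this only in that one gate per layer is reused across all bonds of a fixed-size chain.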