Efficient Closest Matrix Product State Learning in Logarithmic Depth (2510.07798v1)
Abstract: Learning the closest matrix product state (MPS) representation of a quantum state enables useful tools for the prediction and analysis of complex quantum systems. In this work, we study the problem of learning an MPS in the following setting: given many copies of an input MPS, the task is to recover a classical description of the state. The best known polynomial-time algorithm, introduced by [LCLP10, CPF+10], requires linear circuit depth and $O(n^5)$ samples, and has seen no improvement in over a decade. The strongest known lower bound is only $\Omega(n)$. The combination of linear depth and high sample complexity renders existing algorithms impractical for near-term or even early fault-tolerant quantum devices. We present a new efficient MPS learning algorithm that runs in $O(\log n)$ depth and has sample complexity $O(n^3)$. We also generalize our algorithm to learn the closest MPS, where the input state is not guaranteed to be close to an MPS of a fixed bond dimension. Our algorithms improve on both the sample complexity and the circuit depth of the previously known algorithm.
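To make the object being learned concrete, the following sketch shows what an MPS (tensor-train) representation of an $n$-qubit state looks like classically: a chain of rank-3 tensors obtained by sequential SVDs, with each bond truncated to at most $\chi$ singular values. This is a standard illustration of the MPS format, not the paper's quantum learning algorithm; the function names and the choice of `numpy` are my own.

```python
# Hypothetical illustration (NOT the paper's algorithm): compressing a
# classical state vector into an MPS by sequential SVDs with bond
# truncation, then contracting it back. Shows the MPS format only.
import numpy as np

def state_to_mps(psi, n, chi):
    """Decompose a 2**n state vector into n MPS tensors of shape
    (left_bond, physical=2, right_bond), keeping at most chi singular
    values per bond."""
    tensors = []
    rest = psi.reshape(1, -1)                 # (left bond, remaining physical dims)
    for _ in range(n - 1):
        left_dim = rest.shape[0]
        rest = rest.reshape(left_dim * 2, -1) # fold one physical index to the left
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        keep = min(chi, len(s))               # truncate the bond to chi
        u, s, vh = u[:, :keep], s[:keep], vh[:keep, :]
        tensors.append(u.reshape(left_dim, 2, keep))
        rest = np.diag(s) @ vh                # push the remainder to the right
    tensors.append(rest.reshape(rest.shape[0], 2, 1))
    return tensors

def mps_to_state(tensors):
    """Contract the MPS chain back into a full state vector."""
    out = tensors[0]
    for t in tensors[1:]:
        out = np.einsum('apb,bqc->apqc', out, t)
        out = out.reshape(out.shape[0], -1, out.shape[-1])
    return out.reshape(-1)

n = 4
rng = np.random.default_rng(0)
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)
mps = state_to_mps(psi, n, chi=4)  # chi=4 is exact for a generic 4-qubit state
print(np.allclose(mps_to_state(mps), psi))  # → True
```

With a smaller `chi` the reconstruction becomes the truncated-SVD approximation, which is the classical analogue of the "closest MPS with a fixed bond dimension" that the generalized algorithm targets.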