Beating full state tomography for unentangled spectrum estimation (2504.02785v1)
Abstract: How many copies of a mixed state $\rho \in \mathbb{C}^{d \times d}$ are needed to learn its spectrum? To date, the best known algorithms for spectrum estimation require as many copies as full state tomography, suggesting the possibility that learning a state's spectrum might be as difficult as learning the entire state. We show that this is not the case in the setting of unentangled measurements, by giving a spectrum estimation algorithm that uses $n = O(d^3 \cdot (\log\log(d) / \log(d))^4)$ copies of $\rho$, which is asymptotically fewer than the $n = \Omega(d^3)$ copies necessary for full state tomography. Our algorithm is inspired by the technique of local moment matching from classical statistics, and shows how it can be applied in the quantum setting. As an important subroutine in our spectrum estimation algorithm, we give an estimator of the $k$-th moment $\operatorname{tr}(\rho^k)$ which performs unentangled measurements and uses $O(d^{3-2/k})$ copies of $\rho$ in order to achieve a constant multiplicative error. This directly translates to an additive-error estimator of quantum Rényi entropy of order $k$ with the same number of copies. Finally, we present numerical evidence that the sample complexity of spectrum estimation can only improve over full state tomography by a sub-polynomial factor. Specifically, for spectrum learning with fully entangled measurements, we run simulations which suggest a lower bound of $\Omega(d^{2 - \gamma})$ copies for any constant $\gamma > 0$. From this, we conclude the current best lower bound of $\Omega(d)$ is likely not tight.
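The translation from moment estimation to Rényi entropy mentioned in the abstract follows from the identity $H_k(\rho) = \frac{1}{1-k}\log \operatorname{tr}(\rho^k)$: a $(1 \pm \epsilon)$-multiplicative estimate of $\operatorname{tr}(\rho^k)$ shifts the logarithm by at most $\log(1+\epsilon)/(k-1) \le \epsilon$, which is an additive error in the entropy. Below is a minimal numerical sketch of this step; it is an illustration of the identity rather than the paper's estimator, and the helper names (`random_density_matrix`, `renyi_from_moment`) are hypothetical.

```python
import numpy as np

def random_density_matrix(d, rng):
    """Sample a random mixed state via a Ginibre matrix G: rho = GG* / tr(GG*)."""
    g = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def renyi_from_moment(p_k, k):
    """Order-k Renyi entropy from the k-th moment p_k = tr(rho^k)."""
    return np.log(p_k) / (1 - k)

rng = np.random.default_rng(0)
d, k, eps = 16, 3, 0.1  # dimension, Renyi order, multiplicative moment error

rho = random_density_matrix(d, rng)
p_k = np.trace(np.linalg.matrix_power(rho, k)).real  # true k-th moment tr(rho^k)

# A (1 + eps)-multiplicative estimate of tr(rho^k) ...
p_k_hat = (1 + eps) * p_k

# ... yields an additive-error estimate of the order-k Renyi entropy:
h_true = renyi_from_moment(p_k, k)
h_hat = renyi_from_moment(p_k_hat, k)
print(abs(h_hat - h_true))           # observed additive error
print(np.log(1 + eps) / (k - 1))     # equals log(1+eps)/(k-1) <= eps
```

The printed error is exactly $\log(1+\epsilon)/(k-1)$, independent of $\rho$, which is why a constant-multiplicative moment estimator gives a constant-additive Rényi entropy estimator with the same number of copies.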