
Adaptive variational ground state preparation for spin-1 models on qubit-based architectures (2310.03705v1)

Published 5 Oct 2023 in quant-ph and cond-mat.str-el

Abstract: We apply the adaptive variational quantum imaginary time evolution (AVQITE) method to prepare ground states of one-dimensional spin $S=1$ models. We compare different spin-to-qubit encodings (standard binary, Gray, unary, and multiplet) with regard to the performance and quantum resource cost of the algorithm. Using statevector simulations we study two well-known spin-1 models: the Blume-Capel model of transverse-field Ising spins with single-ion anisotropy, and the XXZ model with single-ion anisotropy. We consider system sizes of up to $20$ qubits, which corresponds to spin-$1$ chains up to length $10$. We determine the dependence of the number of CNOT gates in the AVQITE state preparation circuit on the encoding, the initial state, and the choice of operator pool in the adaptive method. Independent of the choice of encoding, we find that the CNOT gate count scales cubically with the number of spins for the Blume-Capel model and quartically for the anisotropic XXZ model. However, the multiplet and Gray encodings present smaller prefactors in the scaling relations. These results provide useful insights for the implementation of AVQITE on quantum hardware.
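As an illustration of how two of the spin-to-qubit encodings mentioned in the abstract can work, here is a minimal Python sketch mapping the three spin-1 levels $m \in \{-1, 0, +1\}$ to two-qubit bit strings. The specific bit conventions below are assumptions for illustration, not taken from the paper:

```python
# Illustrative sketch (bit conventions are assumed, not the paper's exact ones):
# map spin-1 basis states m in {-1, 0, +1} onto two qubits per site under the
# standard-binary and Gray encodings compared in the abstract.

SPIN1_STATES = [-1, 0, +1]

def binary_encoding(m):
    """Standard binary: index the three levels as 0, 1, 2 and write in base 2."""
    idx = m + 1                 # -1 -> 0, 0 -> 1, +1 -> 2
    return format(idx, "02b")   # two qubits per spin-1 site

def gray_encoding(m):
    """Gray code: consecutive levels differ by a single bit flip."""
    idx = m + 1
    g = idx ^ (idx >> 1)        # binary-reflected Gray code
    return format(g, "02b")

def encode_chain(ms, enc):
    """Encode a spin-1 chain; 2 qubits per site, so N sites use 2N qubits."""
    return "".join(enc(m) for m in ms)

if __name__ == "__main__":
    for m in SPIN1_STATES:
        print(m, binary_encoding(m), gray_encoding(m))
    # A length-10 chain uses 20 qubits, matching the system sizes in the abstract.
    print(encode_chain([0] * 10, gray_encoding))
```

The Gray encoding's single-bit-flip property between neighboring levels is one plausible reason it can reduce the two-qubit gate count relative to standard binary, consistent with the smaller scaling prefactors reported in the abstract. The unary encoding (one qubit per level) would instead use 3 qubits per site and is omitted here.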

Citations (1)
