
A Variational Quantum Algorithm for Preparing Quantum Gibbs States (2002.00055v1)

Published 31 Jan 2020 in quant-ph

Abstract: Preparation of Gibbs distributions is an important task for quantum computation. It is a necessary first step in some types of quantum simulations and is further essential for quantum algorithms such as quantum Boltzmann training. Despite this, most methods for preparing thermal states are impractical to implement on near-term quantum computers because of the memory overheads required. Here we present a variational approach to preparing Gibbs states that is based on minimizing the free energy of a quantum system. The key insight that makes this practical is the use of Fourier series approximations to the logarithm, which allows the entropy component of the free energy to be estimated through a sequence of simpler measurements that can be combined using classical post-processing. We further show that this approach is efficient for generating high-temperature Gibbs states, within constant error, if the initial guess for the variational parameters of the programmable quantum circuit is sufficiently close to a global optimum. Finally, we examine the procedure numerically and show the viability of our approach for five-qubit Hamiltonians using Trotterized adiabatic state preparation as an ansatz.
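The variational principle underlying the abstract — that the Gibbs state uniquely minimizes the free energy F(ρ) = Tr(Hρ) − T·S(ρ) — can be illustrated with a small classical simulation. This is a minimal sketch, not the paper's algorithm: the 2-qubit Hamiltonian is a hypothetical example, the entropy is computed by exact diagonalization rather than the paper's Fourier-series measurement scheme, and no variational circuit is involved.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 2-qubit Hamiltonian (not from the paper): Ising coupling
# plus a transverse field on each qubit.
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I = np.eye(2)
H = np.kron(Z, Z) + 0.5 * (np.kron(X, I) + np.kron(I, X))

def entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log rho), in nats."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]  # drop numerically zero eigenvalues
    return -np.sum(w * np.log(w))

def free_energy(rho, H, T):
    """F(rho) = Tr(H rho) - T * S(rho); minimized exactly by the Gibbs state."""
    return np.real(np.trace(H @ rho)) - T * entropy(rho)

T = 2.0  # temperature in units where k_B = 1
gibbs = expm(-H / T)
gibbs /= np.trace(gibbs)  # normalize: Gibbs state e^{-H/T} / Z

# Any other state has strictly higher free energy, e.g. the maximally
# mixed state (H is not proportional to the identity, so they differ).
mixed = np.eye(4) / 4
print(free_energy(gibbs, H, T) < free_energy(mixed, H, T))
```

In the paper's setting, the expensive piece of this objective is the entropy term, which cannot be measured directly on hardware; the Fourier-series approximation to the logarithm replaces the exact diagonalization used above with a sum of simpler expectation values combined classically.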

Citations (70)