
Sample-based training of quantum generative models (2511.11802v1)

Published 14 Nov 2025 in quant-ph

Abstract: Quantum computers can efficiently sample from probability distributions that are believed to be classically intractable, providing a foundation for quantum generative modeling. However, practical training of such models remains challenging, as gradient evaluation via the parameter-shift rule scales linearly with the number of parameters and requires repeated expectation-value estimation under finite-shot noise. We introduce a training framework that extends the principle of contrastive divergence to quantum models. By deriving the required circuit structure and providing a general recipe for constructing it, we obtain quantum circuits that generate the samples needed for parameter updates, yielding a cost that is a constant multiple of a forward pass, analogous to backpropagation in classical neural networks. Numerical results demonstrate that the method attains accuracy comparable to likelihood-based optimization while requiring substantially fewer samples. The framework thereby establishes a scalable route to training expressive quantum generative models directly on quantum hardware.
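To make the scaling bottleneck concrete, here is a minimal Python sketch of gradient estimation via the parameter-shift rule. The `expectation` function is a hypothetical toy stand-in (a product of cosines plus shot noise; this particular toy function happens to satisfy the shift rule exactly), not the paper's model. The point is the cost: every partial derivative needs two shifted expectation estimates, so one full gradient costs 2 * len(theta) forward passes.

```python
import numpy as np

rng = np.random.default_rng(0)

def expectation(theta: np.ndarray, shots: int = 1000) -> float:
    """Hypothetical stand-in for a quantum expectation-value estimator.

    On real hardware this would execute a parameterized circuit and
    average measurement outcomes over a finite number of shots.
    """
    exact = np.prod(np.cos(theta))  # toy closed form, illustration only
    return exact + rng.normal(0.0, 1.0 / np.sqrt(shots))  # finite-shot noise

def parameter_shift_gradient(theta: np.ndarray, shots: int = 1000) -> np.ndarray:
    """Parameter-shift rule: grad_i = (E(theta_i + pi/2) - E(theta_i - pi/2)) / 2.

    Two expectation estimates per parameter, so the cost of one gradient
    grows linearly with len(theta) -- the bottleneck the paper targets.
    """
    grad = np.zeros_like(theta)
    for i in range(len(theta)):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += np.pi / 2
        minus[i] -= np.pi / 2
        grad[i] = 0.5 * (expectation(plus, shots) - expectation(minus, shots))
    return grad  # 2 * len(theta) expectation estimates in total
```

The "principle of contrastive divergence" that the paper extends is the classical sample-based update used for restricted Boltzmann machines: statistics of a data sample (positive phase) are compared against statistics of a short Gibbs-chain reconstruction (negative phase), with no explicit likelihood gradient. A minimal classical CD-1 sketch is below, reusing `np` and `rng` from the sketch above; it illustrates the principle only, not the paper's quantum circuit construction, which the abstract does not spell out.

```python
def cd1_update(W, b, c, v0, lr=0.01):
    """One CD-1 step for a binary RBM with weights W, visible bias b, hidden bias c."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    ph0 = sigmoid(v0 @ W + c)                          # P(h=1 | data sample v0)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)   # sample hidden units
    pv1 = sigmoid(h0 @ W.T + b)                        # one-step reconstruction
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))  # positive - negative phase
    b += lr * (v0 - v1)
    c += lr * (ph0 - ph1)
    return W, b, c
```

Per the abstract, the paper's contribution is a recipe for circuits that produce the analogous positive- and negative-phase samples directly on a quantum device, keeping the update cost a constant multiple of a forward pass regardless of the parameter count.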
