Sample-based training of quantum generative models (2511.11802v1)
Abstract: Quantum computers can efficiently sample from probability distributions that are believed to be classically intractable, providing a foundation for quantum generative modeling. However, practical training of such models remains challenging, as gradient evaluation via the parameter-shift rule scales linearly with the number of parameters and requires repeated expectation-value estimation under finite-shot noise. We introduce a training framework that extends the principle of contrastive divergence to quantum models. By deriving the circuit structure and providing a general recipe for constructing it, we obtain quantum circuits that generate the samples required for parameter updates, yielding a training cost that is a constant multiple of a forward pass, analogous to backpropagation in classical neural networks. Numerical results demonstrate that the method attains accuracy comparable to likelihood-based optimization while requiring substantially fewer samples. The framework thereby establishes a scalable route to training expressive quantum generative models directly on quantum hardware.
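For context on the scaling problem the abstract refers to: the parameter-shift rule obtains each partial derivative from two expectation values evaluated at shifted parameter settings, so a full gradient requires two circuit executions per parameter. The following minimal Python sketch illustrates only that standard rule and its linear cost; the expectation function is a hypothetical stand-in for a finite-shot circuit estimate, not the paper's model or method.

    import numpy as np

    rng = np.random.default_rng(0)

    def expectation(theta, shots=1000):
        # Hypothetical stand-in for a finite-shot estimate of <H> from a
        # parameterized circuit; here a toy analytic value plus shot noise.
        exact = np.sum(np.cos(theta))
        return exact + rng.normal(scale=1.0 / np.sqrt(shots))

    def parameter_shift_gradient(theta, shots=1000):
        # Standard parameter-shift rule:
        #   d<H>/d(theta_i) = [<H>(theta_i + pi/2) - <H>(theta_i - pi/2)] / 2
        # Each parameter needs two expectation estimates, so the gradient
        # costs 2 * len(theta) circuit executions -> linear scaling.
        grad = np.zeros_like(theta)
        for i in range(len(theta)):
            shifted = theta.copy()
            shifted[i] += np.pi / 2
            plus = expectation(shifted, shots)
            shifted[i] -= np.pi
            minus = expectation(shifted, shots)
            grad[i] = 0.5 * (plus - minus)
        return grad

    theta = np.linspace(0.1, 1.0, 8)   # 8 parameters -> 16 circuit runs
    print(parameter_shift_gradient(theta))

The paper's contribution, as stated in the abstract, is a sample-based (contrastive-divergence-style) update that avoids this per-parameter circuit overhead; the sketch above shows the baseline it improves upon, not the proposed scheme.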