Differentiable Learning of Quantum Circuit Born Machine (1804.04168v1)

Published 11 Apr 2018 in quant-ph, cs.LG, and stat.ML

Abstract: Quantum circuit Born machines are generative models which represent the probability distribution of a classical dataset as quantum pure states. Computational complexity considerations of the quantum sampling problem suggest that quantum circuits exhibit stronger expressibility than classical neural networks. One can efficiently draw samples from the quantum circuits via projective measurements on qubits. However, similar to the leading implicit generative models in deep learning, such as the generative adversarial networks, the quantum circuits cannot provide the likelihood of the generated samples, which poses a challenge to the training. We devise an efficient gradient-based learning algorithm for the quantum circuit Born machine by minimizing the kerneled maximum mean discrepancy loss. We simulated generative modeling of the Bars-and-Stripes dataset and Gaussian mixture distributions using deep quantum circuits. Our experiments show the importance of circuit depth and the gradient-based optimization algorithm. The proposed learning algorithm is runnable on near-term quantum devices and can exhibit quantum advantages for generative modeling.

Citations (215)

Summary

  • The paper introduces a gradient-based algorithm that minimizes the MMD loss to effectively train quantum circuit Born machines.
  • It demonstrates that deeper quantum circuits improve representational power on datasets like Bars-and-Stripes and Gaussian mixtures.
  • The research paves the way for scalable quantum-classical hybrid models that overcome limitations of classical generative approaches.

An Overview of Differentiable Learning of Quantum Circuit Born Machine

The paper "Differentiable Learning of Quantum Circuit Born Machine" contributes a methodical paper on the deployment and training of quantum circuit Born machines (QCBMs) as generative models. Essentially, QCBMs aim to represent the probabilistic distribution of classical datasets using quantum states, potentially offering a quantum advantage in terms of representational efficiency and expressivity compared to classical machine learning models, specifically neural networks.

Context and Motivation

Unsupervised generative modeling underpins numerous advances in artificial intelligence, with applications spanning computer vision, speech synthesis, and chemical design. Classical approaches such as GANs, Boltzmann machines, and variational autoencoders have set a high bar for representation, learning, and sample generation, yet they face limitations in efficiently representing high-dimensional probability distributions. As quantum computing hardware advances, quantum circuits may offer robust alternatives or extensions to classical generative models. The work is specifically motivated by the ability of quantum circuits to sample efficiently from implicit distributions even though the modeled probabilities themselves are not directly accessible.

Methodology and Key Contributions

The researchers devised a gradient-based learning algorithm tailored to QCBMs that minimizes the kernelized maximum mean discrepancy (MMD) loss. This loss measures the discrepancy between the distribution of samples generated by the quantum circuit and that of the training data. A core innovation is an unbiased estimator of the gradient, obtained from projective measurements alone: running the same circuit with a single rotation parameter shifted by ±π/2 and comparing kernel expectations over the resulting measurement outcomes yields the exact gradient of the MMD loss. This development opens practical avenues for training QCBMs with gradient-based approaches analogous to backpropagation in classical deep learning, while respecting the constraints of quantum computation.
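To make this concrete, below is a minimal NumPy sketch of the MMD loss and its parameter-shift gradient estimate. The circuit itself is abstracted behind a hypothetical `sample_circuit(theta, n)` that returns `n` integer-encoded measurement outcomes from the parameterized circuit; the single-bandwidth Gaussian kernel is also a simplification (the paper uses a mixture of bandwidths). Function names and the sampling interface are illustrative, not the authors' API.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Pairwise Gaussian kernel between two 1-D arrays of integer-encoded
    bitstring samples. A single bandwidth is used here for brevity."""
    d = x[:, None].astype(float) - y[None, :].astype(float)
    return np.exp(-d**2 / (2 * sigma**2))

def mmd_loss(gen, data, sigma=1.0):
    """Squared MMD between generated and data samples (simple plug-in
    estimator; the diagonal terms make it slightly biased)."""
    kxx = gaussian_kernel(gen, gen, sigma).mean()
    kyy = gaussian_kernel(data, data, sigma).mean()
    kxy = gaussian_kernel(gen, data, sigma).mean()
    return kxx - 2 * kxy + kyy

def mmd_grad(theta, i, data, sample_circuit, n=1000, sigma=1.0):
    """Unbiased gradient of the MMD loss w.r.t. circuit parameter theta[i]:
    sample the circuit with theta[i] shifted by +/- pi/2 and compare kernel
    expectations against the model samples and the data samples."""
    tp, tm = theta.copy(), theta.copy()
    tp[i] += np.pi / 2
    tm[i] -= np.pi / 2
    x  = sample_circuit(theta, n)   # samples from p_theta
    xp = sample_circuit(tp, n)      # samples from the +shifted circuit
    xm = sample_circuit(tm, n)      # samples from the -shifted circuit
    grad_model = (gaussian_kernel(xp, x, sigma).mean()
                  - gaussian_kernel(xm, x, sigma).mean())
    grad_data  = (gaussian_kernel(xp, data, sigma).mean()
                  - gaussian_kernel(xm, data, sigma).mean())
    return grad_model - grad_data

# A training step is then ordinary gradient descent, e.g.
# theta[i] -= learning_rate * mmd_grad(theta, i, data, sample_circuit)
```

Note that the gradient requires only extra runs of the same circuit at shifted parameter values, which is why the scheme is compatible with near-term hardware.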

The paper presents simulations on the Bars-and-Stripes and Gaussian mixture datasets, demonstrating the practicality of the approach. The results highlight the significant impact of circuit depth and the effectiveness of gradient-based optimization in reducing the MMD loss, pointing to improved representational power with deeper circuits.
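For reference, the Bars-and-Stripes dataset is small enough to enumerate exactly: a valid image fills entire rows ("stripes") or entire columns ("bars"). A short sketch of a generator (the function name is illustrative):

```python
import itertools
import numpy as np

def bars_and_stripes(rows, cols):
    """Enumerate all Bars-and-Stripes images as flattened binary vectors:
    every pattern whose pixels are constant along rows or along columns."""
    patterns = set()
    for bits in itertools.product([0, 1], repeat=rows):  # horizontal stripes
        patterns.add(tuple(np.repeat(bits, cols)))
    for bits in itertools.product([0, 1], repeat=cols):  # vertical bars
        patterns.add(tuple(np.tile(bits, rows)))
    return np.array(sorted(patterns))
```

For example, the 2x2 grid admits 6 distinct valid patterns (the all-zero and all-one images count as both bars and stripes), and the trained circuit should assign probability only to these bitstrings.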

Implications and Future Prospects

The presented methodology outlines a scalable quantum-classical hybrid approach suitable for near-term quantum devices. The framework suggests a path toward overcoming some challenges inherent in classical generative tasks, especially in modeling highly complex, high-dimensional distributions. One theoretical implication of the research is evidence that the representational power of quantum circuits remains useful even at the modest depths achievable on realistic hardware.

From a practical perspective, the research suggests several avenues for advancing quantum machine learning. With further hardware improvements, one can anticipate more extensive experimentation on practical datasets, possibly leading to quantum-enhanced generative models that outperform their classical counterparts on specific tasks. Another speculative but promising direction lies in exploring other loss functions or adversarial setups, akin to GANs, which could further improve the robustness and generalization capabilities of QCBMs.

Overall, while the vanishing-gradient problem noted by the authors remains a concern, the adaptability and depth flexibility of the QCBM suggest that, with optimized circuit structures or weight-sharing strategies, it can develop into a valuable tool within quantum artificial intelligence and beyond.