Enhancing Generative Models via Quantum Correlations (2101.08354v1)

Published 20 Jan 2021 in quant-ph, cond-mat.stat-mech, cs.LG, and stat.ML

Abstract: Generative modeling using samples drawn from a probability distribution constitutes a powerful approach for unsupervised machine learning. Quantum mechanical systems can produce probability distributions that exhibit quantum correlations which are difficult to capture using classical models. We show theoretically that such quantum correlations provide a powerful resource for generative modeling. In particular, we provide an unconditional proof of separation in expressive power between a class of widely-used generative models, known as Bayesian networks, and its minimal quantum extension. We show that this expressivity advantage is associated with quantum nonlocality and quantum contextuality. Furthermore, we numerically test this separation on standard machine learning data sets and show that it holds for practical problems. The possibility of quantum advantage demonstrated in this work not only sheds light on the design of useful quantum machine learning protocols but also provides inspiration to draw on ideas from quantum foundations to improve purely classical algorithms.
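To illustrate the classical baseline the abstract refers to, the following sketch shows ancestral sampling from a Bayesian network, the class of generative models whose minimal quantum extension the paper studies. The three-node chain, its conditional probability tables, and all numerical values are invented for illustration and do not come from the paper.

```python
import random

# Hypothetical toy Bayesian network: a chain X1 -> X2 -> X3 over binary
# variables. Each conditional P(X_i = 1 | parent) is a lookup table.
P_X1 = 0.6                            # P(X1 = 1)
P_X2_GIVEN_X1 = {0: 0.2, 1: 0.8}      # P(X2 = 1 | X1)
P_X3_GIVEN_X2 = {0: 0.3, 1: 0.7}      # P(X3 = 1 | X2)

def sample():
    """Draw one sample (x1, x2, x3) by sampling nodes in topological order."""
    x1 = int(random.random() < P_X1)
    x2 = int(random.random() < P_X2_GIVEN_X1[x1])
    x3 = int(random.random() < P_X3_GIVEN_X2[x2])
    return x1, x2, x3

# Drawing many samples approximates the joint distribution
# P(x1, x2, x3) = P(x1) P(x2 | x1) P(x3 | x2).
samples = [sample() for _ in range(10_000)]
freq_x1 = sum(s[0] for s in samples) / len(samples)  # empirical P(X1 = 1)
```

The paper's separation result concerns distributions that such classical factorized models cannot express efficiently but a minimal quantum extension can; this snippet only shows the classical sampling procedure being compared against.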

Authors (5)
  1. Xun Gao (42 papers)
  2. Eric R. Anschuetz (23 papers)
  3. Sheng-Tao Wang (38 papers)
  4. J. Ignacio Cirac (227 papers)
  5. Mikhail D. Lukin (242 papers)
Citations (67)
