
Continuous-variable Quantum Boltzmann Machine (2405.06580v1)

Published 10 May 2024 in quant-ph

Abstract: We propose a continuous-variable quantum Boltzmann machine (CVQBM) using a powerful energy-based neural network. It can be realized experimentally on a continuous-variable (CV) photonic quantum computer. We used a CV quantum imaginary time evolution (QITE) algorithm to prepare the essential thermal state and then designed the CVQBM to proficiently generate continuous probability distributions. We applied our method to both classical and quantum data. Using real-world classical data, such as synthetic aperture radar (SAR) images, we generated probability distributions. For quantum data, we used the output of CV quantum circuits. We obtained high fidelity and low Kullback-Leibler (KL) divergence, showing that our CVQBM learns the distribution underlying the given data well and samples from it efficiently. We also discuss the experimental feasibility of our proposed CVQBM. Our method can be applied to a wide range of real-world problems by choosing an appropriate target distribution (corresponding to, e.g., SAR images, medical images, or risk management in finance). Moreover, our CVQBM is versatile and could be programmed to perform tasks beyond generation, such as anomaly detection.
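The abstract evaluates the CVQBM by the KL divergence between the target distribution and the distribution of generated samples. As a minimal sketch of that evaluation step (not the paper's implementation): the two continuous distributions are discretized into normalized histograms over a common binning, and the discrete KL divergence D(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ) is computed. The Gaussian stand-in data below is hypothetical, purely for illustration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(p || q) between two histograms.

    p, q: non-negative bin counts or probabilities over the same bins.
    eps guards against log(0) when a bin of q is empty.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()          # normalize to probability vectors
    q = q / q.sum()
    mask = p > 0             # bins with p = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

# Hypothetical stand-ins: 'target' plays the role of the data distribution
# (e.g. SAR pixel intensities), 'model' the CVQBM's generated samples.
rng = np.random.default_rng(0)
target_samples = rng.normal(0.0, 1.0, size=50_000)
model_samples = rng.normal(0.1, 1.1, size=50_000)

bins = np.linspace(-5.0, 5.0, 101)
p, _ = np.histogram(target_samples, bins=bins)
q, _ = np.histogram(model_samples, bins=bins)

print(kl_divergence(p, q))  # close to 0 when the model matches the target
```

A low value indicates the generated samples track the target distribution; note that KL divergence is asymmetric, so D(p‖q) and D(q‖p) generally differ.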

Citations (1)
