
$\zeta$-QVAE: A Quantum Variational Autoencoder utilizing Regularized Mixed-state Latent Representations (2402.17749v3)

Published 27 Feb 2024 in quant-ph

Abstract: A major challenge in quantum computing is its application to large real-world datasets due to scarce quantum hardware resources. One approach to enabling tractable quantum models for such datasets involves finding low-dimensional representations that preserve essential information for downstream analysis. In classical machine learning, variational autoencoders (VAEs) facilitate efficient data compression, representation learning for subsequent tasks, and novel data generation. However, no quantum model has been proposed that captures these features for direct application to quantum data on quantum computers. Some existing quantum models for data compression lack regularization of latent representations. Others are hybrid models with only some internal quantum components, impeding direct training on quantum data. To address this, we present a fully quantum framework, $\zeta$-QVAE, which encompasses all the capabilities of classical VAEs and can be directly applied to map both classical and quantum data to a lower-dimensional space, while effectively reconstructing much of the original state from it. Our model utilizes regularized mixed states to attain optimal latent representations. It accommodates various divergences for reconstruction and regularization. Furthermore, by accommodating mixed states at every stage, it can utilize the full-data density matrix and allow for a training objective defined on probabilistic mixtures of input data. Doing so, in turn, makes efficient optimization possible and has potential implications for private and federated learning. In addition to exploring the theoretical properties of $\zeta$-QVAE, we demonstrate its performance on genomics and synthetic data. Our results indicate that $\zeta$-QVAE learns representations that better utilize the capacity of the latent space and exhibits similar or better performance compared to matched classical models.
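The abstract notes that the model uses regularized mixed-state latents and "accommodates various divergences for reconstruction and regularization." As an illustration only (not the paper's implementation), the sketch below computes two standard quantities between density matrices that appear in this setting — Uhlmann fidelity and quantum relative entropy — and combines them into a hypothetical VAE-style loss. The helper names and the `beta`-weighted `toy_vae_loss` are assumptions for illustration, not the paper's objective.

```python
import numpy as np

def _hermitian_fn(h, fn):
    """Apply a scalar function to the eigenvalues of a Hermitian matrix."""
    w, v = np.linalg.eigh(h)
    return (v * fn(w)) @ v.conj().T

def _psd_sqrt(rho):
    # clip tiny negative eigenvalues arising from floating-point noise
    return _hermitian_fn(rho, lambda w: np.sqrt(np.clip(w, 0.0, None)))

def _psd_log(rho, eps=1e-12):
    # clip tiny eigenvalues so the log stays finite for near-singular states
    return _hermitian_fn(rho, lambda w: np.log(np.clip(w, eps, None)))

def random_density_matrix(dim, rng):
    """Full-rank random mixed state via a Wishart-style construction."""
    a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = a @ a.conj().T              # positive semidefinite
    return rho / np.trace(rho).real   # unit trace

def fidelity(rho, sigma):
    """Uhlmann fidelity F(rho, sigma) = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))**2."""
    s = _psd_sqrt(rho)
    return float(np.real(np.trace(_psd_sqrt(s @ sigma @ s))) ** 2)

def relative_entropy(rho, sigma):
    """Quantum relative entropy S(rho || sigma) = Tr[rho (log rho - log sigma)]."""
    return float(np.real(np.trace(rho @ (_psd_log(rho) - _psd_log(sigma)))))

def toy_vae_loss(rho_in, rho_recon, rho_latent, rho_prior, beta=1.0):
    """Hypothetical VAE-style objective (illustration only): reconstruction
    infidelity plus a beta-weighted divergence pulling the latent state
    toward a prior state."""
    return (1.0 - fidelity(rho_in, rho_recon)) \
        + beta * relative_entropy(rho_latent, rho_prior)
```

Both quantities operate on full density matrices rather than pure-state vectors, which mirrors the abstract's point that accommodating mixed states at every stage lets the objective be defined on probabilistic mixtures of input data.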
