$\zeta$-QVAE: A Quantum Variational Autoencoder utilizing Regularized Mixed-state Latent Representations (2402.17749v3)
Abstract: A major challenge in quantum computing is its application to large real-world datasets due to scarce quantum hardware resources. One approach to enabling tractable quantum models for such datasets involves finding low-dimensional representations that preserve essential information for downstream analysis. In classical machine learning, variational autoencoders (VAEs) facilitate efficient data compression, representation learning for subsequent tasks, and novel data generation. However, no quantum model has been proposed that captures these features for direct application to quantum data on quantum computers. Some existing quantum models for data compression lack regularization of latent representations. Others are hybrid models with only some internal quantum components, impeding direct training on quantum data. To address this, we present a fully quantum framework, $\zeta$-QVAE, which encompasses all the capabilities of classical VAEs and can be directly applied to map both classical and quantum data to a lower-dimensional space, while effectively reconstructing much of the original state from it. Our model utilizes regularized mixed states to attain optimal latent representations. It accommodates various divergences for reconstruction and regularization. Furthermore, by accommodating mixed states at every stage, it can utilize the full-data density matrix and allow for a training objective defined on probabilistic mixtures of input data. Doing so, in turn, makes efficient optimization possible and has potential implications for private and federated learning. In addition to exploring the theoretical properties of $\zeta$-QVAE, we demonstrate its performance on genomics and synthetic data. Our results indicate that $\zeta$-QVAE learns representations that better utilize the capacity of the latent space and exhibits similar or better performance compared to matched classical models.
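The abstract describes a training objective built from divergences between density matrices: a reconstruction term comparing the input and reconstructed mixed states, and a regularization term pulling the latent mixed state toward a prior. As a minimal illustrative sketch (not the paper's implementation), the snippet below computes two standard such divergences, Uhlmann fidelity and quantum relative entropy, and combines them into a VAE-style loss; the function names and the $\beta$-weighted combination are assumptions for illustration only.

```python
import numpy as np

def _apply_fn(rho, fn):
    """Apply a scalar function to a Hermitian matrix via eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    return (v * fn(w)) @ v.conj().T

def fidelity(rho, sigma):
    """Uhlmann fidelity F(rho, sigma) = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    sqrt_rho = _apply_fn(rho, lambda w: np.sqrt(np.clip(w, 0.0, None)))
    inner = _apply_fn(sqrt_rho @ sigma @ sqrt_rho,
                      lambda w: np.sqrt(np.clip(w, 0.0, None)))
    return float(np.real(np.trace(inner)) ** 2)

def relative_entropy(rho, sigma, eps=1e-12):
    """Quantum relative entropy S(rho || sigma) = Tr[rho (log rho - log sigma)]."""
    log_rho = _apply_fn(rho, lambda w: np.log(np.clip(w, eps, None)))
    log_sigma = _apply_fn(sigma, lambda w: np.log(np.clip(w, eps, None)))
    return float(np.real(np.trace(rho @ (log_rho - log_sigma))))

def vae_style_loss(rho_in, rho_rec, rho_latent, rho_prior, beta=1.0):
    """Hypothetical objective: reconstruction infidelity plus a
    beta-weighted divergence of the latent state from its prior."""
    return (1.0 - fidelity(rho_in, rho_rec)) \
        + beta * relative_entropy(rho_latent, rho_prior)
```

For instance, with a maximally mixed single-qubit state `rho = np.eye(2) / 2`, `fidelity(rho, rho)` returns 1 and `relative_entropy(rho, rho)` returns 0, so the regularizer vanishes when the latent state matches the prior.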