Quantum Generative Diffusion Model: A Fully Quantum-Mechanical Model for Generating Quantum State Ensemble (2401.07039v4)

Published 13 Jan 2024 in quant-ph and cs.LG

Abstract: Classical diffusion models have shown superior generative results. Exploring them in the quantum domain can advance the field of quantum generative learning. This work introduces the Quantum Generative Diffusion Model (QGDM) as their simple and elegant quantum counterpart. Through a non-unitary forward process, any target quantum state can be transformed into a completely mixed state, which has the highest entropy and maximum uncertainty about the system. A trainable backward process is then used to recover the former from the latter. The design requirements for the backward process include non-unitarity and a small parameter count. We introduce partial trace operations to enforce non-unitarity, and we reduce the number of trainable parameters by using a parameter-sharing strategy and incorporating temporal information as an input to the backward process. We also present a resource-efficient version of QGDM that reduces the number of auxiliary qubits while preserving generative capabilities. QGDM exhibits faster convergence than the Quantum Generative Adversarial Network (QGAN) because it adopts a convex-based optimization. The results of comparing it with QGAN demonstrate its effectiveness in generating both pure and mixed quantum states, achieving 53.02% higher fidelity in mixed-state generation than QGAN. These results highlight its great potential for tackling challenging quantum generation tasks.
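
The non-unitary forward process described in the abstract drives any target state toward the completely mixed state I/2^n. As a rough illustration only, the NumPy sketch below assumes a stepwise depolarizing-style interpolation toward I/2^n with a hypothetical constant rate beta; the paper's actual channel and noise schedule may differ.

```python
import numpy as np

def forward_step(rho: np.ndarray, beta: float) -> np.ndarray:
    """One non-unitary forward step: mix rho toward the completely mixed state I / dim."""
    dim = rho.shape[0]
    return (1.0 - beta) * rho + beta * np.eye(dim) / dim

# Example: diffuse a single-qubit pure state |0><0| for T steps.
rho = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)
T, beta = 20, 0.2  # hypothetical schedule, for illustration only
for _ in range(T):
    rho = forward_step(rho, beta)

print(np.round(rho.real, 4))  # approaches the maximally mixed state [[0.5, 0], [0, 0.5]]
```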
