Introducing Reduced-Width QNNs, an AI-inspired Ansatz Design Pattern (2306.05047v3)

Published 8 Jun 2023 in quant-ph and cs.ET

Abstract: Variational Quantum Algorithms are one of the most promising candidates to yield the first industrially relevant quantum advantage. Being capable of arbitrary function approximation, they are often referred to as Quantum Neural Networks (QNNs) when used in settings analogous to classical Artificial Neural Networks (ANNs). As in the early stages of classical machine learning, known schemes for efficient architectures of these networks are scarce. Exploring beyond existing design patterns, we propose a reduced-width circuit ansatz design, motivated by recent results from the analysis of dropout regularization in QNNs. More precisely, this exploits the insight that the gates of overparameterized QNNs can be pruned substantially before their expressibility decreases. The results of our case study show that the proposed design pattern can significantly reduce training time while maintaining the same result quality as the standard "full-width" design in the presence of noise.
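
The core idea, a variational layer whose trainable gates act on only a subset of the available qubits rather than on all of them, can be illustrated with a short sketch. The following is a minimal, hypothetical PennyLane example contrasting a standard "full-width" hardware-efficient layer with a reduced-width variant; the qubit count, the choice of RY/CNOT gates, and the active_wires subset are illustrative assumptions, not the circuits or hyperparameters used in the paper.

```python
# Illustrative sketch only, not the authors' code: a "full-width" layer places a
# trainable rotation on every qubit, while a "reduced-width" layer restricts the
# trainable gates to a chosen subset of qubits, in the spirit of pruning gates
# from an overparameterized QNN.
import numpy as np
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

def full_width_layer(params):
    # One rotation per qubit, then a ring of entanglers (standard layer, for comparison).
    for w in range(n_qubits):
        qml.RY(params[w], wires=w)
    for w in range(n_qubits):
        qml.CNOT(wires=[w, (w + 1) % n_qubits])

def reduced_width_layer(params, active_wires):
    # Trainable rotations and entanglers only on the chosen subset of qubits,
    # reducing the parameter count contributed by each layer.
    for i, w in enumerate(active_wires):
        qml.RY(params[i], wires=w)
    for i in range(len(active_wires) - 1):
        qml.CNOT(wires=[active_wires[i], active_wires[i + 1]])

@qml.qnode(dev)
def reduced_width_qnn(params, active_wires=(0, 2)):
    for layer_params in params:
        reduced_width_layer(layer_params, active_wires)
    return qml.expval(qml.PauliZ(0))

# Example: 3 reduced-width layers acting on 2 of the 4 qubits -> 6 trainable parameters
params = np.random.uniform(0, 2 * np.pi, size=(3, 2))
print(reduced_width_qnn(params))
```

With the assumed sizes, the reduced-width circuit uses 6 parameters where three full-width layers would use 12, which is the kind of gate reduction the abstract attributes to the pruning of overparameterized QNNs.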

