Optimizing Quantum Convolutional Neural Network Architectures for Arbitrary Data Dimension (2403.19099v1)
Abstract: Quantum convolutional neural networks (QCNNs) represent a promising approach in quantum machine learning, opening new directions for both quantum and classical data analysis. The approach is particularly attractive due to the absence of the barren plateau problem, a fundamental challenge in training quantum neural networks (QNNs), and its practical feasibility. However, a limitation arises when applying QCNNs to classical data: the network architecture is most natural when the number of input qubits is a power of two, since each pooling layer halves this number. The number of input qubits determines the dimension (i.e., the number of features) of the input data that can be processed, which restricts the applicability of QCNN algorithms to real-world data. To address this issue, we propose a QCNN architecture capable of handling arbitrary input data dimensions while optimizing the allocation of quantum resources such as ancillary qubits and quantum gates. This optimization is important not only for minimizing computational resources but also for noisy intermediate-scale quantum (NISQ) computing, where the size of the quantum circuits that can be executed reliably is limited. Through numerical simulations, we benchmarked the classification performance of various QCNN architectures handling arbitrary input data dimensions on the MNIST and Breast Cancer datasets. The results validate that the proposed QCNN architecture achieves excellent classification performance with minimal resource overhead, providing an optimal solution when reliable quantum computation is constrained by noise and imperfections.
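To make the structural issue concrete, the sketch below shows a generic QCNN-style ansatz in PennyLane that accepts an arbitrary number of input qubits: a parameterized two-qubit "convolution" acts on neighboring active qubits, and a "pooling" step (here a controlled rotation from each discarded qubit onto its kept partner) shrinks the active set by roughly a factor of two per layer, so the circuit still terminates on a single readout qubit even when the qubit count is not a power of two. This is only a minimal illustration under assumed design choices (angle encoding, CRZ pooling, nearest-neighbor pairing, illustrative parameter shapes); it is not the architecture proposed in the paper.

```python
# Minimal QCNN-style sketch for an arbitrary number of qubits (illustrative only).
import numpy as np
import pennylane as qml

n_qubits = 6  # arbitrary input dimension: one feature per qubit in this sketch
dev = qml.device("default.qubit", wires=n_qubits)


def conv_layer(active, params):
    """Parameterized two-qubit 'convolution' on neighboring active qubits."""
    for i, (a, b) in enumerate(zip(active[::2], active[1::2])):
        qml.RY(params[i, 0], wires=a)
        qml.RY(params[i, 1], wires=b)
        qml.CNOT(wires=[a, b])


def pool_layer(active, params):
    """'Pooling': controlled rotation from each discarded qubit onto the kept
    qubit; the discarded qubit is simply ignored afterwards."""
    kept, dropped = active[::2], active[1::2]
    for i, (d, k) in enumerate(zip(dropped, kept)):
        qml.CRZ(params[i], wires=[d, k])
    return kept  # qubits that remain active (an unpaired qubit carries over)


@qml.qnode(dev)
def qcnn(features, conv_params, pool_params):
    # Angle encoding: one classical feature rotated onto each qubit.
    qml.AngleEmbedding(features, wires=range(n_qubits), rotation="Y")
    active = list(range(n_qubits))
    layer = 0
    while len(active) > 1:
        conv_layer(active, conv_params[layer])
        active = pool_layer(active, pool_params[layer])
        layer += 1
    return qml.expval(qml.PauliZ(active[0]))


# Enough layers to halve (with ceiling) the active qubits down to one.
n_layers = int(np.ceil(np.log2(n_qubits)))
conv_params = np.random.uniform(0, 2 * np.pi, (n_layers, n_qubits // 2, 2))
pool_params = np.random.uniform(0, 2 * np.pi, (n_layers, n_qubits // 2))
x = np.random.uniform(0, np.pi, n_qubits)
print(qcnn(x, conv_params, pool_params))
```

Because pooling keeps ceil(m/2) of m active qubits, any input size reduces to a single qubit after about log2(n) layers; how to do this while minimizing ancillary qubits and gate overhead is the question the paper addresses.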