MNISQ: A Large-Scale Quantum Circuit Dataset for Machine Learning on/for Quantum Computers in the NISQ era (2306.16627v1)

Published 29 Jun 2023 in quant-ph and cs.LG

Abstract: We introduce MNISQ, the first large-scale dataset for both the quantum and the classical machine learning communities in the Noisy Intermediate-Scale Quantum (NISQ) era. MNISQ consists of 4,950,000 data points organized in 9 subdatasets. Built from quantum encodings of classical data (e.g., the MNIST dataset), the dataset comes in a dual form: in quantum form, as circuits, and in classical form, as circuit descriptions in the QASM quantum programming language. Machine learning research related to quantum computers itself faces a dual challenge: enhancing machine learning by exploiting the power of quantum computers, while also leveraging state-of-the-art classical machine learning methodologies to advance quantum computing. We therefore perform circuit classification on our dataset with both quantum and classical models. On the quantum side, we test the circuit dataset with quantum kernel methods and obtain excellent results, up to $97\%$ accuracy. On the classical side, the quantum mechanical structure underlying the circuit data is non-trivial; nevertheless, we evaluate three classical models: the Structured State Space sequence model (S4), the Transformer, and the LSTM. In particular, the S4 model applied to the tokenized QASM sequences reaches an impressive $77\%$ accuracy. These findings suggest that quantum-circuit-related datasets are likely to offer a quantum advantage, but also that state-of-the-art classical machine learning methods can competently classify and recognize quantum circuits. We finally entrust to the quantum and classical machine learning communities the fundamental challenge of building more quantum-classical datasets like ours and of turning our experiments into future benchmarks. The dataset is accessible on GitHub, and its circuits are easily run in qulacs or qiskit.
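
The abstract notes that MNISQ circuits are distributed as QASM descriptions that can be run in qulacs or qiskit, and that the classical models (S4, Transformer, LSTM) operate on tokenized QASM sequences. The sketch below illustrates both ideas with Qiskit only; the QASM string is a hypothetical toy circuit rather than an actual MNISQ data point, the dataset's own loading API is not shown, and the tokenization is a simple illustrative split that may differ from the paper's exact scheme.

```python
# Minimal sketch (not the authors' released loader): execute a QASM-format
# circuit description with Qiskit and turn the same text into a token sequence
# for a classical sequence model. The QASM below is a toy example, not an
# actual MNISQ data point.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qasm_str = """OPENQASM 2.0;
include "qelib1.inc";
qreg q[2];
h q[0];
cx q[0],q[1];
"""

# "Quantum form": rebuild the circuit and simulate its output state.
circuit = QuantumCircuit.from_qasm_str(qasm_str)
state = Statevector.from_instruction(circuit)
print(state)

# "Classical form": treat the QASM text as a token sequence, e.g. as input to
# an S4 / Transformer / LSTM classifier (simple whitespace split with separated
# semicolons; the paper's tokenization scheme may differ).
tokens = qasm_str.replace(";", " ;").split()
print(tokens)
```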
