Transformer models as an efficient replacement for statistical test suites to evaluate the quality of random numbers (2405.03904v2)

Published 6 May 2024 in cs.LG

Abstract: Random numbers are critically important in a variety of fields, and validating their quality remains essential for safety. A Quantum Random Number Generator (QRNG) can theoretically generate truly random numbers; however, their quality still needs to be thoroughly validated. The task of validating random numbers has generally been delegated to statistical tests such as those from the NIST Statistical Test Suite (STS), which are often slow and only perform one test at a time. Our work presents a deep learning model utilizing the Transformer architecture that 1) performs multiple NIST STS tests at once, and 2) runs much faster. The model outputs multi-label classification results indicating whether a sequence passes these statistical tests. We performed a thorough hyper-parameter optimization to converge on the best possible model and, as a result, achieved a high degree of accuracy with a Macro F1-score above 0.96. We also compared this model to a conventional deep learning method (Long Short-Term Memory Recurrent Neural Networks) for quantifying randomness and showed that our model achieves similar performance while being much more efficient and scalable. The high performance and efficiency of this Transformer-based deep learning model show that it can be a viable replacement for the NIST STS for validating random numbers.
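The abstract does not specify the architecture details, so the following is only a minimal PyTorch sketch of what a Transformer-based multi-label classifier over binary sequences might look like. The class name, layer sizes, sequence length, and the number of STS tests (num_tests) are all illustrative assumptions, not the authors' actual configuration.

```python
import torch
import torch.nn as nn

class RandomnessTransformer(nn.Module):
    """Hypothetical sketch: encode a bit sequence with a Transformer and
    emit one logit per NIST STS test (multi-label classification)."""

    def __init__(self, seq_len=512, d_model=64, nhead=4, num_layers=2, num_tests=5):
        super().__init__()
        self.embed = nn.Embedding(2, d_model)                        # bits {0,1} -> vectors
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))    # learned positions (assumed)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_tests)                    # one logit per STS test

    def forward(self, bits):                                         # bits: (batch, seq_len), int64
        x = self.embed(bits) + self.pos
        x = self.encoder(x)
        x = x.mean(dim=1)                                            # pool over the sequence
        return self.head(x)                                          # raw logits

# Illustrative multi-label training step: labels[i, j] = 1 if sequence i
# passes test j; Macro F1 would then be computed on thresholded sigmoids.
model = RandomnessTransformer()
bits = torch.randint(0, 2, (8, 512))
labels = torch.randint(0, 2, (8, 5)).float()
loss = nn.BCEWithLogitsLoss()(model(bits), labels)
loss.backward()
```

The key design point suggested by the abstract is the multi-label head: a single forward pass yields pass/fail predictions for several statistical tests simultaneously, which is what allows the model to replace multiple sequential STS runs.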

