Noise-Aware Training of Neuromorphic Dynamic Device Networks (2401.07387v2)
Abstract: Physical computing has the potential to enable widespread embodied intelligence by leveraging the intrinsic dynamics of complex systems for efficient sensing, processing, and interaction. While individual devices provide basic data processing capabilities, networks of interconnected devices can perform more complex and varied tasks. However, designing networks to perform dynamic tasks is challenging without physical models and accurate quantification of device noise. We propose a novel, noise-aware methodology for training device networks using Neural Stochastic Differential Equations (Neural-SDEs) as differentiable digital twins, accurately capturing the dynamics and associated stochasticity of devices with intrinsic memory. Our approach employs backpropagation through time and cascade learning, allowing networks to effectively exploit the temporal properties of physical devices. We validate our method on diverse networks of spintronic devices across temporal classification and regression benchmarks. By decoupling the training of individual device models from network training, our method reduces the required training data and provides a robust framework for programming dynamical devices without relying on analytical descriptions of their dynamics.
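The core of a Neural-SDE digital twin is a stochastic differential equation whose drift and diffusion terms are learned networks; the paper itself does not give an implementation, so the following is only a minimal sketch of the Euler–Maruyama integration step such a twin would rely on. The function name `euler_maruyama` and the toy `drift`/`diffusion` stand-ins (fixed closed-form functions rather than trained networks) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, dt, n_steps, rng):
    """Integrate dx = f(x) dt + g(x) dW with the Euler-Maruyama scheme.

    In a Neural-SDE digital twin, `drift` and `diffusion` would be small
    neural networks fitted to measured device trajectories; here they are
    plain functions so the sketch stays self-contained.
    """
    x = np.asarray(x0, dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=x.shape)  # Brownian increment
        x = x + drift(x) * dt + diffusion(x) * dw
        path.append(x.copy())
    return np.stack(path)

# Toy stand-ins for the learned networks (illustrative only):
drift = lambda x: -x                          # relaxation mimicking device memory decay
diffusion = lambda x: 0.1 * np.ones_like(x)   # constant intrinsic device noise

rng = np.random.default_rng(0)
path = euler_maruyama(drift, diffusion, x0=[1.0], dt=0.01, n_steps=100, rng=rng)
print(path.shape)  # (101, 1)
```

Because each forward pass through this integrator is an ordinary (if noisy) sequence of differentiable operations, wrapping the same loop in an autodiff framework is what makes backpropagation through time over the stochastic device dynamics possible.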