Memristive Reservoirs Learn to Learn (2306.12676v1)

Published 22 Jun 2023 in cond-mat.dis-nn and cs.AI

Abstract: Memristive reservoirs draw inspiration from a novel class of neuromorphic hardware known as nanowire networks. These systems display emergent brain-like dynamics, with optimal performance demonstrated at dynamical phase transitions. In these networks, a limited number of electrodes are available to modulate system dynamics, in contrast to the global controllability offered by neuromorphic hardware through random access memories. We demonstrate that the learn-to-learn framework can effectively address this challenge in the context of optimization. Using the framework, we successfully identify the optimal hyperparameters for the reservoir. This finding aligns with previous research, which suggests that the optimal performance of a memristive reservoir occurs at the 'edge of formation' of a conductive pathway. Furthermore, our results show that these systems can mimic membrane potential behavior observed in spiking neurons, and may serve as an interface between spike-based and continuous processes.
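The learn-to-learn setup the abstract describes has two nested loops: an inner loop that trains only a linear readout on a fixed reservoir, and a gradient-free outer loop that tunes the few hyperparameters the hardware exposes. The sketch below illustrates that structure with a conventional echo state network as a software stand-in for the memristive nanowire reservoir; the task (one-step sine prediction), the hyperparameters chosen (spectral radius and input scaling), and the tiny (1+1) evolution strategy are all illustrative assumptions, not the paper's actual setup.

```python
# Learn-to-learn sketch: outer-loop hyperparameter search over a fixed-readout
# inner loop, using a standard echo state network (ESN) in place of the paper's
# nanowire reservoir. Task and hyperparameter names are illustrative only.
import numpy as np

def run_reservoir(spectral_radius, input_scale, u, n_nodes=50):
    """Drive a fixed random tanh reservoir with input u; return all states."""
    rng = np.random.default_rng(1)  # fixed seed: same reservoir every call
    W = rng.normal(size=(n_nodes, n_nodes))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    w_in = rng.normal(size=n_nodes) * input_scale
    x = np.zeros(n_nodes)
    states = np.empty((len(u), n_nodes))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + w_in * ut)
        states[t] = x
    return states

def inner_loop_error(theta, u, y, washout=50, ridge=1e-6):
    """Inner loop: fit a ridge-regression readout, return held-out MSE."""
    states = run_reservoir(theta[0], theta[1], u)
    X, Y = states[washout:], y[washout:]
    n_train = len(X) // 2
    A = X[:n_train]
    w = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ Y[:n_train])
    pred = X[n_train:] @ w
    return float(np.mean((pred - Y[n_train:]) ** 2))

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(600)
u = np.sin(0.2 * t)
y = np.roll(u, -1)

# Outer loop: a minimal (1+1) evolution strategy over (spectral radius,
# input scaling) -- the kind of gradient-free optimizer used when the
# substrate itself is not differentiable.
rng = np.random.default_rng(0)
theta = np.array([1.5, 1.0])              # deliberately poor initial guess
init_err = inner_loop_error(theta, u, y)
best = init_err
for _ in range(30):
    cand = np.clip(theta + rng.normal(scale=0.1, size=2), 0.05, None)
    err = inner_loop_error(cand, u, y)
    if err < best:                        # keep only strict improvements
        theta, best = cand, err

print("tuned hyperparameters:", theta, "held-out MSE:", best)
```

Because the outer loop accepts only improvements, the final error is never worse than the initial guess; in the hardware setting of the paper, the analogous outer loop would steer the reservoir toward the 'edge of formation' regime rather than tune a spectral radius.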

