A Fast Algorithm to Simulate Nonlinear Resistive Networks (2402.11674v2)

Published 18 Feb 2024 in cs.ET, cond-mat.dis-nn, and cs.LG

Abstract: Analog electrical networks have long been investigated as energy-efficient computing platforms for machine learning, leveraging analog physics during inference. More recently, resistor networks have sparked particular interest due to their ability to learn using local rules (such as equilibrium propagation), enabling potentially important energy efficiency gains for training as well. Despite this potential advantage, the simulation of these resistor networks has been a significant bottleneck in assessing their scalability, with current methods either limited to linear networks or relying on realistic but slow circuit simulators like SPICE. Assuming ideal circuit elements, we introduce a novel approach for the simulation of nonlinear resistive networks, which we frame as a quadratic programming problem with linear inequality constraints, and which we solve using a fast, exact coordinate descent algorithm. Our simulation methodology significantly outperforms existing SPICE-based simulations, enabling the training of networks up to 327 times larger at speeds 160 times faster, resulting in a 50,000-fold improvement in the ratio of network size to epoch duration. Our approach can foster more rapid progress in the simulation of nonlinear analog electrical networks.
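The abstract's computational core, solving a convex quadratic program exactly one coordinate at a time, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it assumes a box-constrained QP (minimize ½·vᵀQv − bᵀv subject to lo ≤ v ≤ hi) as a stand-in for the paper's linear inequality constraints, and the function name and signature are hypothetical. Each 1-D subproblem is a convex quadratic, so its closed-form minimizer clipped to the feasible interval is the exact coordinate update.

```python
import numpy as np

def coordinate_descent_qp(Q, b, lo, hi, v0=None, n_sweeps=1000, tol=1e-8):
    """Exact coordinate descent for the box-constrained QP

        minimize  0.5 * v^T Q v - b^T v   subject to  lo <= v <= hi,

    with Q symmetric positive definite. Each coordinate is minimized
    exactly in closed form, then clipped to its box (exact because the
    constraints are separable). Illustrative sketch, not the paper's code.
    """
    n = len(b)
    v = np.zeros(n) if v0 is None else v0.astype(float).copy()
    for _ in range(n_sweeps):
        max_delta = 0.0
        for i in range(n):
            # Unconstrained 1-D minimizer: v_i* = (b_i - sum_{j!=i} Q_ij v_j) / Q_ii
            r = b[i] - Q[i] @ v + Q[i, i] * v[i]   # b_i minus off-diagonal coupling
            v_new = np.clip(r / Q[i, i], lo[i], hi[i])
            max_delta = max(max_delta, abs(v_new - v[i]))
            v[i] = v_new
        if max_delta < tol:   # converged: no coordinate moved appreciably
            break
    return v

# Example: a random strongly convex QP with unit box constraints.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
Q = A @ A.T + 50.0 * np.eye(50)   # symmetric positive definite
b = rng.standard_normal(50)
v = coordinate_descent_qp(Q, b, lo=-np.ones(50), hi=np.ones(50))
```

In a resistive network, the analogue of Q would typically be sparse, since nodal analysis couples each node only to its neighbors, making each coordinate update cheap; the dense-matrix version above is kept only for clarity.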
