Strong anti-Hebbian plasticity alters the convexity of network attractor landscapes (2312.14896v1)

Published 22 Dec 2023 in cs.NE, cs.SY, eess.SY, math.DS, and q-bio.NC

Abstract: In this paper, we study recurrent neural networks in the presence of pairwise learning rules. We are specifically interested in how the attractor landscapes of such networks become altered as a function of the strength and nature (Hebbian vs. anti-Hebbian) of learning, which may have a bearing on the ability of such rules to mediate large-scale optimization problems. Through formal analysis, we show that a transition from Hebbian to anti-Hebbian learning brings about a pitchfork bifurcation that destroys convexity in the network attractor landscape. In larger-scale settings, this implies that anti-Hebbian plasticity will bring about multiple stable equilibria, and such effects may be outsized at interconnection or `choke' points. Furthermore, attractor landscapes are more sensitive to slower learning rates than faster ones. These results provide insight into the types of objective functions that can be encoded via different pairwise plasticity rules.
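
The abstract describes coupled neuron-and-weight dynamics in which a pairwise plasticity gain is moved from the Hebbian to the anti-Hebbian regime and the number of stable equilibria of the joint system changes. The sketch below is a minimal numerical illustration of that style of analysis, not the paper's actual model: it assumes a two-neuron continuous-time network with a single reciprocal weight w, a pairwise rule with gain kappa (kappa > 0 treated as Hebbian, kappa < 0 as anti-Hebbian) and learning rate eps, and it simply counts the distinct attractors reached from random initial conditions as kappa is swept. All equations and parameter names here are illustrative assumptions.

```python
# Minimal exploratory sketch (assumed toy model, not the paper's equations):
# two neurons coupled by one reciprocal weight w that evolves under a
# pairwise plasticity rule with gain kappa and slow learning rate eps.
import numpy as np

def simulate(kappa, eps=0.1, T=200.0, dt=0.01, x0=None):
    """Euler-integrate the coupled neuron/weight dynamics; return the final state."""
    x1, x2, w = x0 if x0 is not None else np.zeros(3)
    for _ in range(int(T / dt)):
        dx1 = -x1 + w * np.tanh(x2)                            # neuron 1
        dx2 = -x2 + w * np.tanh(x1)                            # neuron 2
        dw = eps * (kappa * np.tanh(x1) * np.tanh(x2) - w)     # pairwise rule
        x1, x2, w = x1 + dt * dx1, x2 + dt * dx2, w + dt * dw
    return np.array([x1, x2, w])

def count_attractors(kappa, n_trials=40, decimals=2, seed=1):
    """Count distinct equilibria reached from random initial conditions."""
    rng = np.random.default_rng(seed)
    finals = [simulate(kappa, x0=rng.uniform(-2.0, 2.0, size=3))
              for _ in range(n_trials)]
    return len({tuple(np.round(f, decimals)) for f in finals})

if __name__ == "__main__":
    # Sweep the plasticity gain from Hebbian (positive) to anti-Hebbian (negative).
    for kappa in [2.0, 1.0, 0.0, -1.0, -2.0]:
        print(f"kappa = {kappa:+.1f}: {count_attractors(kappa)} distinct attractor(s)")
```

Varying eps in the same sketch gives a rough handle on the abstract's remark about sensitivity to slower learning rates, though this toy system is not guaranteed to reproduce the specific pitchfork bifurcation the paper analyzes formally.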

