Evidence of Scaling Regimes in the Hopfield Dynamics of Whole Brain Model (2401.07538v3)

Published 15 Jan 2024 in cond-mat.dis-nn and cs.NE

Abstract: It is shown that a Hopfield recurrent neural network exhibits a scaling regime whose specific exponents depend on the number of parcels used and on the decay length of the coupling strength. This scaling regime recovers the picture introduced by Deco et al., according to which information transfer within the human brain shows spatially correlated patterns qualitatively similar to those displayed by turbulent flows, although with a more singular exponent, 1/2 instead of 2/3. Both models employ a coupling strength that decays exponentially with the Euclidean distance between nodes, informed by experimentally derived brain topology. Nevertheless, their mathematical nature is very different: Hopf oscillators versus a Hopfield neural network, respectively. Hence, their convergence for the same data parameters suggests an intriguing robustness of the scaling picture. Furthermore, the present analysis shows that the Hopfield model brain remains functional after removing links longer than about five decay lengths, corresponding to about one sixth of the size of the global brain. This suggests that, in terms of connectivity decay length, the Hopfield brain functions in a sort of intermediate "turbulent liquid"-like state, whose essential connections are those at intermediate scales, between the connectivity decay length and the global brain size. The evident sensitivity of the scaling exponent to the value of the decay length, as well as to the number of brain parcels employed, leads us to treat with great caution any quantitative assessment regarding the specific nature of the scaling regime.
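The abstract's ingredients can be sketched in a few lines of code: parcels at positions in 3D space, couplings that decay exponentially with Euclidean distance, removal of links beyond about five decay lengths, and a scaling diagnostic over intermediate distances. The sketch below is hypothetical in every specific: the parcel count, the decay length, the random centroids (the paper uses experimentally derived brain topology, e.g. the Schaefer parcellation of ref. 29), the zero-temperature binary update rule, and the structure-function diagnostic are all assumptions standing in for details the abstract does not give.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-ins for quantities the abstract does not specify.
    N = 400                  # number of parcels (assumed)
    lam = 10.0               # coupling decay length, mm (assumed)
    pos = rng.uniform(0.0, 150.0, size=(N, 3))   # random centroids, mm

    # Pairwise Euclidean distances and exponentially decaying couplings,
    # J_ij = exp(-r_ij / lam), the form both models in the abstract share.
    r = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    J = np.exp(-r / lam)
    np.fill_diagonal(J, 0.0)

    # Dilution: drop links longer than ~5 decay lengths, the cutoff beyond
    # which the paper reports the model brain remains functional.
    J = np.where(r <= 5.0 * lam, J, 0.0)

    # Generic zero-temperature asynchronous Hopfield dynamics on +/-1
    # states; the paper's actual update rule is not given in the abstract.
    s = rng.choice([-1.0, 1.0], size=N)
    for _ in range(50 * N):
        i = rng.integers(N)
        h = J[i] @ s         # local field at parcel i
        s[i] = 1.0 if h >= 0.0 else -1.0

    # Crude scaling diagnostic: second-order structure function
    # S2(d) = <(s_i - s_j)^2> over pairs at distance ~ d. A power law
    # S2 ~ d^zeta over intermediate distances is the kind of regime the
    # paper probes (zeta ~ 1/2 there, vs. 2/3 in Kolmogorov turbulence).
    edges = np.linspace(5.0, 50.0, 10)
    diff2 = (s[:, None] - s[None, :]) ** 2
    idx = np.digitize(r, edges)
    for k in range(1, len(edges)):
        mask = idx == k
        if mask.any():
            print(f"d ~ {edges[k-1]:4.1f}-{edges[k]:4.1f} mm  S2 = {diff2[mask].mean():.3f}")

Replacing the random centroids with real parcel coordinates and fitting zeta on a log-log plot of S2 against d would be the natural next steps; a binary network will not reproduce the paper's exponents quantitatively, but the pipeline mirrors the construction the abstract describes.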

References (30)
  1. G. Deco and M. L. Kringelbach, "Turbulent-like dynamics in the human brain," Cell Reports, vol. 33, p. 108471, Dec. 2020.
  2. S. I. Amari, "Learning patterns and pattern sequences by self-organizing nets of threshold elements," IEEE Transactions on Computers, vol. C-21, pp. 1197–1206, 1972.
  3. W. A. Little, "The existence of persistent states in the brain," Mathematical Biosciences, vol. 19, pp. 101–120, Feb. 1974.
  4. J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proceedings of the National Academy of Sciences of the United States of America, vol. 79, pp. 2554–2558, Apr. 1982.
  5. D. J. Amit, H. Gutfreund, and H. Sompolinsky, "Storing infinite numbers of patterns in a spin-glass model of neural networks," Physical Review Letters, vol. 55, pp. 1530–1533, Sep. 1985.
  6. N. Brunel, "Is cortical connectivity optimized for storing information?," Nature Neuroscience, vol. 19, pp. 749–755, May 2016.
  7. C. J. Hillar and N. M. Tran, "Robust exponential memory in Hopfield networks," Journal of Mathematical Neuroscience, vol. 8, no. 1, 2018.
  8. C. Hillar, T. Chan, R. Taubman, and D. Rolnick, "Hidden hypergraphs, error-correcting codes, and critical learning in Hopfield networks," Entropy, vol. 23, p. 1494, Nov. 2021.
  9. D. Stauffer, A. Aharony, L. D. F. Costa, and J. Adler, "Efficient Hopfield pattern recognition on a scale-free neural network," The European Physical Journal B, vol. 32, pp. 395–399, 2003.
  10. D.-H. Kim, J. Park, and B. Kahng, "Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study," PLOS ONE, vol. 12, p. e0184683, Oct. 2017.
  11. V. Folli, G. Gosti, M. Leonetti, and G. Ruocco, "Effect of dilution in asymmetric recurrent neural networks," Neural Networks, vol. 104, pp. 50–59, Aug. 2018.
  12. M. Leonetti, V. Folli, E. Milanetti, G. Ruocco, and G. Gosti, "Network dilution and asymmetry in an efficient brain," Philosophical Magazine, vol. 100, pp. 2544–2555, Oct. 2020.
  13. K. Gopalsamy and X.-Z. He, "Stability in asymmetric Hopfield nets with transmission delays," Physica D: Nonlinear Phenomena, vol. 76, pp. 344–358, Sep. 1994.
  14. Z.-B. Xu, G.-Q. Hu, and C.-P. Kwong, "Asymmetric Hopfield-type networks: Theory and applications," Neural Networks, vol. 9, pp. 483–501, Apr. 1996.
  15. T. Chen and S. I. Amari, "Stability of asymmetric Hopfield networks," IEEE Transactions on Neural Networks, vol. 12, pp. 159–163, Jan. 2001.
  16. F. M. Franca and Z. Yang, "Building artificial CPGs with asymmetric Hopfield networks," Proceedings of the International Joint Conference on Neural Networks, vol. 4, pp. 290–295, 2000.
  17. P. Zheng, J. Zhang, and W. Tang, "Analysis and design of asymmetric Hopfield networks with discrete-time dynamics," Biological Cybernetics, vol. 103, pp. 79–85, Jul. 2010.
  18. A. Szedlak, G. Paternostro, and C. Piermarocchi, "Control of asymmetric Hopfield networks and application to cancer attractors," PLOS ONE, vol. 9, p. e105842, Aug. 2014.
  19. G. Gosti, E. Milanetti, V. Folli, F. de Pasquale, M. Leonetti, M. Corbetta, G. Ruocco, and S. Della Penna, "A recurrent Hopfield network for estimating meso-scale effective connectivity in MEG," Neural Networks, vol. 170, pp. 72–93, 2024.
  20. D. O. Hebb, The Organization of Behavior: A Neuropsychological Theory. Wiley, Oct. 1949.
  21. A. Fachechi, E. Agliari, and A. Barra, "Dreaming neural networks: Forgetting spurious memories and reinforcing pure ones," Neural Networks, vol. 112, 2019.
  22. M. Benedetti, E. Ventura, E. Marinari, G. Ruocco, and F. Zamponi, "Supervised perceptron learning vs unsupervised Hebbian unlearning: Approaching optimal memory retrieval in Hopfield-like networks," Journal of Chemical Physics, vol. 156, 2022.
  23. M. Benedetti and E. Ventura, "Training neural networks with structured noise improves classification and generalization," arXiv:2302.13417v4, Feb. 2023.
  24. G. Gosti, V. Folli, M. Leonetti, and G. Ruocco, "Beyond the maximum storage capacity limit in Hopfield recurrent neural networks," Entropy, vol. 21, p. 726, Jul. 2019.
  25. S. Hwang, V. Folli, E. Lanza, G. Parisi, G. Ruocco, and F. Zamponi, "On the number of limit cycles in asymmetric neural networks," Journal of Statistical Mechanics: Theory and Experiment, vol. 2019, p. 053402, May 2019.
  26. S. Hwang, E. Lanza, G. Parisi, J. Rocchi, G. Ruocco, and F. Zamponi, "On the number of limit cycles in diluted neural networks," Journal of Statistical Physics, vol. 181, pp. 2304–2321, Dec. 2020.
  27. U. Frisch, Turbulence. Cambridge University Press, Nov. 1995.
  28. R. Benzi, S. Ciliberto, R. Tripiccione, C. Baudet, F. Massaioli, and S. Succi, "Extended self-similarity in turbulent flows," Physical Review E, vol. 48, p. R29, Jul. 1993.
  29. A. Schaefer, R. Kong, E. M. Gordon, T. O. Laumann, X.-N. Zuo, A. J. Holmes, S. B. Eickhoff, and B. T. T. Yeo, "Local-global parcellation of the human cerebral cortex from intrinsic functional connectivity MRI," Cerebral Cortex, vol. 28, pp. 3095–3114, Sep. 2018.
  30. V. Folli, M. Leonetti, and G. Ruocco, "On the maximum storage capacity of the Hopfield model," Frontiers in Computational Neuroscience, vol. 10, p. 144, Jan. 2017.