Biological computations: limitations of attractor-based formalisms and the need for transients (2404.10369v1)

Published 16 Apr 2024 in q-bio.OT

Abstract: Living systems, from single cells to higher vertebrates, receive a continuous stream of non-stationary inputs that they sense, e.g., via cell surface receptors or sensory organs. Integrating this time-varying, multi-sensory, and often noisy information with memory using complex molecular or neuronal networks, they generate a variety of responses beyond simple stimulus-response association, including avoidance behavior, life-long learning, or social interactions. In a broad sense, these processes can be understood as a type of biological computation. Taking as a basis generic features of biological computations, such as real-time responsiveness or robustness and flexibility of the computation, we highlight the limitations of the current attractor-based framework for understanding computations in biological systems. We argue that frameworks based on transient dynamics away from attractors are better suited for the description of computations performed by neuronal and signaling networks. In particular, we discuss how quasi-stable transient dynamics from ghost states that emerge at criticality have a promising potential for developing an integrated framework of computations that can help us understand how living systems actively process information and learn from their continuously changing environment.


Summary

  • The paper argues that traditional attractor-based models are insufficient for capturing the dynamic adaptability required by biological systems.
  • It uses the EGFR signaling network as an example to illustrate how descriptions built on stable states fail to accommodate continuous, real-time environmental fluctuations.
  • The study advocates for transient dynamics, highlighting ghost states as a promising basis for describing flexible, adaptive cellular responses.

Analyzing the Limitations of Attractor-Based Formalisms in Biological Computations

The paper "Biological computations: limitations of attractor-based formalisms and the need for transients" systematically examines the inherent complexities in biological computations that transcend the traditional attractor-based frameworks typically used in computational models. The authors, Koch et al., posit that since living organisms engage with continuous, non-stationary inputs, they require a computational understanding that accommodates real-time responsiveness and dynamic adaptability.

Limitations of Attractor-Based Models

The attractor-based framework, inspired primarily by mathematical and computational theories, hinges on the notion that a system converges to a stable state, an attractor, after an input or perturbation, and that this final state constitutes the result of the computation. Current understanding, extrapolated from machine computation, suggests that such attractors are sufficient for modeling cognitive processes and cellular responses. The paper challenges this notion, pointing out that attractor-based frameworks fail when biological systems must continuously adapt to fluctuating environments, as seen in immune responses or behavioral adaptation in animals.
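To make this picture concrete, the following minimal sketch (ours, not taken from the paper) simulates the simplest form of attractor-based computation: a one-dimensional bistable system dx/dt = x - x^3, in which every initial condition relaxes to one of the two stable fixed points at x = -1 or x = +1. The identity of the attractor reached serves as the output, and all information about the transient itself is discarded.

```python
# Minimal sketch of attractor-based computation (illustrative, not the
# paper's model): dx/dt = x - x**3 is bistable, and every initial condition
# relaxes to one of the two stable fixed points x = -1 or x = +1.
def simulate(x0, dt=0.01, t_max=20.0):
    """Integrate dx/dt = x - x**3 with forward Euler from x0."""
    x = x0
    for _ in range(int(t_max / dt)):
        x += dt * (x - x**3)
    return x

for x0 in (-0.8, -0.05, 0.05, 0.8):
    print(f"x0 = {x0:+.2f}  ->  settles at x ~ {simulate(x0):+.2f}")
# Initial conditions with x0 < 0 end near -1, those with x0 > 0 near +1:
# the attractor reached is the "result", and the transient is discarded.
```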

Evidence Against Attractor Models in Cellular Computation

Using the epidermal growth factor receptor (EGFR) network as an example, the paper illustrates the shortcomings of attractor-based computation. Cells responding to spatiotemporal signals must adapt in real time, a requirement that cannot be met when the system's dynamics are described solely by stable states. In scenarios demanding responsiveness to temporal variations, attractor-based models fall short because they lack the flexibility needed to integrate continuously evolving environmental cues.
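The failure mode can be illustrated with a generic bistable toggle driven by a time-varying input; this is a hedged sketch with hypothetical parameters, not the paper's actual EGFR model. Once a strong pulse flips the system into the upper attractor, later sub-threshold fluctuations of the input leave the state essentially unchanged, so the readout stores the past but no longer tracks the ongoing stimulus.

```python
# Generic bistable switch driven by a time-varying input (hypothetical
# parameters, not the paper's EGFR model): dx/dt = x - x**3 + I(t).
# A strong pulse latches the state into the upper attractor; later input
# fluctuations below the saddle-node threshold |I| ~ 0.38 cannot unlatch it.
def stimulus(t):
    if 5.0 <= t < 10.0:
        return 1.0    # strong pulse: flips the switch
    if 20.0 <= t < 25.0:
        return -0.2   # later, weaker fluctuation: below threshold
    return 0.0

dt, t_max = 0.01, 40.0
x, trace = -1.0, []   # start in the lower ("off") attractor
for step in range(int(t_max / dt)):
    t = step * dt
    x += dt * (x - x**3 + stimulus(t))
    trace.append(x)

for t_probe in (4.0, 12.0, 24.0, 35.0):
    xv = trace[int(t_probe / dt)]
    print(f"t = {t_probe:5.1f}  I = {stimulus(t_probe):+.1f}  x = {xv:+.2f}")
# After t = 10 the state remains latched in the upper basin: the attractor
# memorizes the pulse but no longer tracks subsequent input changes.
```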

Transients and Quasi-Stable Dynamics

Koch et al. advocate for a transient-dynamics-based approach, where quasi-stable states that emerge at criticality provide a better model for biological processes. The concept of ghost states, as discussed in the paper, offers one such promising avenue. These states, characterized by temporary stability, facilitate memory retention and real-time processing, maintaining responsiveness without requiring a system to reside indefinitely in a defined attractor.

Transients, particularly those organized by ghost states, enable biological computations to adapt and process information dynamically. This approach offers enhanced flexibility, evidenced by the ability to process signals with varying frequencies or interruptions, an advantage in realistic biological settings that demand immediate and adaptable responses.
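A purely illustrative sketch of what a ghost state is, using the textbook saddle-node normal form (as found, e.g., in Strogatz's Nonlinear Dynamics and Chaos) rather than the ghost-channel or ghost-cycle constructions of the paper: for dx/dt = r + x^2 with small r > 0, the two fixed points have just annihilated, yet trajectories stall for a long time near x = 0, the "ghost" of the vanished fixed point, with a dwell time that grows roughly like pi/sqrt(r) as the system approaches criticality.

```python
# Illustrative "ghost" of a saddle-node bifurcation (textbook normal form,
# not the paper's ghost-channel/ghost-cycle models): dx/dt = r + x**2.
# For small r > 0 there is no fixed point left, but trajectories linger
# near x = 0 for a time that grows roughly like pi / sqrt(r).
import math

def transit_time(r, x0=-2.0, x1=2.0, dt=1e-4):
    """Time to travel from x0 to x1 under dx/dt = r + x**2 (forward Euler)."""
    x, t = x0, 0.0
    while x < x1:
        x += dt * (r + x**2)
        t += dt
    return t

for r in (0.1, 0.01, 0.001):
    print(f"r = {r:6.3f}  transit time ~ {transit_time(r):7.1f}"
          f"   (estimate pi/sqrt(r) ~ {math.pi / math.sqrt(r):7.1f})")
# The closer the system sits to criticality (smaller r), the longer it
# dwells in the quasi-stable ghost state: a transient that can hold
# information without trapping the system in an attractor.
```

In this toy setting, the dwell time is tuned by the distance from criticality, which conveys the intuition behind the paper's proposal that quasi-stable transients can provide memory-like persistence while preserving the ability to move on when conditions change.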

Future Research Directions

The paper underscores the potential of transient-based computational approaches in forming a cohesive framework applicable to both neural and cellular systems across different scales. Emphasizing the need for further studies into these transient mechanisms, it suggests that future research should focus on elucidating the geometric and dynamical properties that could unify our understanding of natural computations. An integrated framework might offer insights into unresolved questions about anticipatory processes, adaptability, and lifelong learning in biological entities.

In conclusion, this paper calls for a paradigm shift from traditional attractor-based models to transient dynamics, highlighting their promise in mirroring the nuanced, real-time adaptability seen in living systems. By exploring how transient dynamics can better simulate the essential features of biological computation, this research paves the way for innovative computational models that more closely align with the complex nature of life.
