
Attractor-based models for sequences and pattern generation in neural circuits (2410.11012v1)

Published 14 Oct 2024 in q-bio.NC

Abstract: Neural circuits in the brain perform a variety of essential functions, including input classification, pattern completion, and the generation of rhythms and oscillations that support processes such as breathing and locomotion. There is also substantial evidence that the brain encodes memories and processes information via sequences of neural activity. In this dissertation, we are focused on the general problem of how neural circuits encode rhythmic activity, as in central pattern generators (CPGs), as well as the encoding of sequences. Traditionally, rhythmic activity and CPGs have been modeled using coupled oscillators. Here we take a different approach, and present models for several different neural functions using threshold-linear networks. Our approach aims to unify attractor-based models (e.g., Hopfield networks) which encode static and dynamic patterns as attractors of the network. In the first half of this dissertation, we present several attractor-based models. These include: a network that can count the number of external inputs it receives; two models for locomotion, one encoding five different quadruped gaits and another encoding the orientation system of a swimming mollusk; and, finally, a model that connects the fixed point sequences with locomotion attractors to obtain a network that steps through a sequence of dynamic attractors. In the second half of the thesis, we present new theoretical results, some of which have already been published. There, we established conditions on network architectures to produce sequential attractors. Here we also include several new theorems relating the fixed points of composite networks to those of their component subnetworks, as well as a new architecture for layering networks which produces "fusion" attractors by minimizing interference between the attractors of individual layers.
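The threshold-linear networks referenced in the abstract evolve firing rates through a rectified linear update, dx_i/dt = -x_i + [Σ_j W_ij x_j + b_i]_+. A minimal sketch of these dynamics is below; the 3-neuron cyclic-inhibition weight matrix, inputs, and integration settings are illustrative choices, not parameters taken from the dissertation.

```python
import numpy as np

def simulate_tln(W, b, x0, dt=0.01, steps=5000):
    """Forward-Euler integration of threshold-linear network dynamics:
    dx/dt = -x + [W x + b]_+, where [.]_+ = max(., 0)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + np.maximum(W @ x + b, 0.0))
    return x

# Toy CPG-like motif (hypothetical example): three neurons with
# asymmetric cyclic inhibition, a common way such networks produce
# sequential or rhythmic activity rather than a single fixed point.
W = np.array([[ 0.0, -0.5, -1.5],
              [-1.5,  0.0, -0.5],
              [-0.5, -1.5,  0.0]])
b = np.ones(3)          # constant external drive
x_final = simulate_tln(W, b, x0=[0.1, 0.2, 0.3])
```

With purely inhibitory off-diagonal weights and drive b = 1, the rectified input never exceeds 1, so rates remain nonnegative and bounded; changing the sign structure of W is what distinguishes fixed-point attractors from the dynamic (sequence or rhythm) attractors studied in the thesis.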
