Sequential attractors in combinatorial threshold-linear networks (2107.10244v4)

Published 21 Jul 2021 in q-bio.NC

Abstract: Sequences of neural activity arise in many brain areas, including cortex, hippocampus, and central pattern generator circuits that underlie rhythmic behaviors like locomotion. While network architectures supporting sequence generation vary considerably, a common feature is an abundance of inhibition. In this work, we focus on architectures that support sequential activity in recurrently connected networks with inhibition-dominated dynamics. Specifically, we study emergent sequences in a special family of threshold-linear networks, called combinatorial threshold-linear networks (CTLNs), whose connectivity matrices are defined from directed graphs. Such networks naturally give rise to an abundance of sequences whose dynamics are tightly connected to the underlying graph. We find that architectures based on generalizations of cycle graphs produce limit cycle attractors that can be activated to generate transient or persistent (repeating) sequences. Each architecture type gives rise to an infinite family of graphs that can be built from arbitrary component subgraphs. Moreover, we prove a number of graph rules for the corresponding CTLNs in each family. The graph rules allow us to strongly constrain, and in some cases fully determine, the fixed points of the network in terms of the fixed points of the component subnetworks. Finally, we also show how the structure of certain architectures gives insight into the sequential dynamics of the corresponding attractor.
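To make the model concrete, below is a minimal simulation sketch (not taken from the paper) of threshold-linear dynamics on a directed 3-cycle graph, using the commonly cited CTLN parameterization in which edges give weight -1 + epsilon and non-edges give -1 - delta; the specific values epsilon = 0.25, delta = 0.5, theta = 1 and the Euler step size are illustrative assumptions. The 3-cycle is the simplest example of the cycle-graph architectures mentioned in the abstract, and its limit cycle activates the neurons in sequence.

# Minimal sketch, assuming the standard CTLN setup:
#   dx_i/dt = -x_i + [sum_j W_ij x_j + theta]_+
# with W_ij = -1 + eps if j -> i in the graph, -1 - delta otherwise, 0 on the diagonal.
# Parameter values below are illustrative, not taken from this abstract.
import numpy as np

def ctln_weights(adjacency, eps=0.25, delta=0.5):
    # adjacency[i, j] = 1 means there is an edge i -> j.
    # W[i, j] depends on whether j -> i, i.e. on adjacency[j, i].
    W = np.where(adjacency.T == 1, -1.0 + eps, -1.0 - delta)
    np.fill_diagonal(W, 0.0)
    return W

def simulate(W, theta=1.0, x0=None, dt=0.01, T=40.0):
    # Forward-Euler integration of the threshold-linear dynamics.
    n = W.shape[0]
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    steps = int(T / dt)
    traj = np.empty((steps, n))
    for t in range(steps):
        x = x + dt * (-x + np.maximum(W @ x + theta, 0.0))
        traj[t] = x
    return traj

# Directed 3-cycle: 1 -> 2 -> 3 -> 1.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
traj = simulate(ctln_weights(A), x0=[0.2, 0.0, 0.0])
# After a transient, the firing rates settle into a limit cycle in which the
# neurons peak in the order 1 -> 2 -> 3, mirroring the underlying cycle graph.
print(traj[-5:])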
