Transitions among metastable states underlie context-dependent working memories in a multiple timescale network (2104.10829v1)

Published 22 Apr 2021 in physics.bio-ph and q-bio.NC

Abstract: Transitions between metastable states are commonly observed in neural systems and underlie various cognitive functions such as working memory. In a previous study, we developed a neural network model with slow and fast populations, in which simple Hebb-type learning enables stable and complex (e.g., non-Markov) transitions between neural states. This model is distinct both from networks with asymmetric Hebbian connectivity and from networks trained with supervised machine learning methods: the former generate only simple Markov sequences, while the latter generate complex sequences that are vulnerable to perturbation and rely on biologically implausible learning methods. Using our model, we propose and demonstrate a novel mechanism underlying stable working memories: sequentially stabilizing and destabilizing task-related states in the fast neural dynamics. The slow dynamics maintain a history of the applied inputs, e.g., context signals, and enable the task-related states to be stabilized in a context-dependent manner. We found that only a single state (or a few states) is stabilized in each epoch (i.e., a period with the context signal present or a subsequent delay period) of a working memory task, resulting in robust performance against noise and against changes in the task protocol. These results suggest a simple mechanism underlying complex and stable processing in neural systems.
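The abstract describes a two-timescale architecture: a fast population with Hebbian connectivity that settles into task-related states, and a slow population that integrates the history of inputs (e.g., context signals) and biases which fast state is stabilized. The sketch below is a minimal illustration of that idea, not the authors' implementation; the network size, time constants, the slow-to-fast feedback strength g, the noise level, and the form of the context signal are all assumptions made for illustration.

```python
import numpy as np

# Minimal sketch (assumed, not the paper's code) of a fast-slow rate network:
# symmetric Hebbian weights store patterns in the fast population, and a slow
# population acts as a leaky integrator of fast activity that biases which
# stored pattern the fast dynamics settle into. Parameter values are illustrative.

rng = np.random.default_rng(0)
N = 100                                   # number of fast units
P = 3                                     # number of stored patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = patterns.T @ patterns / N             # symmetric Hebb-type connectivity
np.fill_diagonal(W, 0.0)

tau_slow = 50.0                           # slow population is much slower than the fast one
g = 0.5                                   # slow-to-fast feedback strength (assumed)
dt = 0.1

x = 0.1 * rng.standard_normal(N)          # fast activity
y = np.zeros(N)                           # slow activity (trace of past inputs/states)

def step(x, y, ext_input):
    # Fast dynamics: recurrent Hebbian input + bias from slow trace + external context + noise
    x_new = np.tanh(W @ x + g * y + ext_input + 0.05 * rng.standard_normal(N))
    # Slow dynamics: leaky integration of the fast activity on a long timescale
    y_new = y + dt / tau_slow * (-y + x)
    return x_new, y_new

# Present a context signal overlapping pattern 0, then a delay period; the slow
# trace retains the context and keeps the matching fast state stabilized.
for t in range(200):
    ctx = 0.3 * patterns[0] if t < 50 else np.zeros(N)
    x, y = step(x, y, ctx)

overlaps = patterns @ x / N
print("overlaps with stored patterns:", np.round(overlaps, 2))
```

In this toy version, the slow variable y simply low-pass filters the fast activity; in the paper's framework the slow population more generally carries a history of applied inputs that determines, context-dependently, which fast states are stabilized or destabilized in each epoch.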
