Structure of activity in multiregion recurrent neural networks (2402.12188v2)
Abstract: Neural circuits are composed of multiple regions, each with rich dynamics and engaging in communication with other regions. The combination of local, within-region dynamics and global, network-level dynamics is thought to provide computational flexibility. However, the nature of such multiregion dynamics and the underlying synaptic connectivity patterns remain poorly understood. Here, we study the dynamics of recurrent neural networks with multiple interconnected regions. Within each region, neurons have a combination of random and structured recurrent connections. Motivated by experimental evidence of communication subspaces between cortical areas, these networks have low-rank connectivity between regions, enabling selective routing of activity. These networks exhibit two interacting forms of dynamics: high-dimensional fluctuations within regions and low-dimensional signal transmission between regions. To characterize this interaction, we develop a dynamical mean-field theory to analyze such networks in the limit where each region contains infinitely many neurons, with cross-region currents as key order parameters. Regions can act as both generators and transmitters of activity, roles that we show are in conflict. Specifically, taming the complexity of activity within a region is necessary for it to route signals to and from other regions. Unlike previous models of routing in neural circuits, which suppressed the activities of neuronal groups to control signal flow, routing in our model is achieved by exciting different high-dimensional activity patterns through a combination of connectivity structure and nonlinear recurrent dynamics. This theory provides insight into the interpretation of both multiregion neural data and trained neural networks.
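The architecture described in the abstract — full-rank random recurrence within each region, low-rank "communication subspace" connectivity between regions, with cross-region currents as order parameters — can be sketched in a minimal two-region simulation. All specifics here (region size `N`, gain `g`, rank, time step, and the particular low-rank factorization) are illustrative assumptions, not the paper's actual parameter settings.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500          # neurons per region (the theory takes N -> infinity)
g = 1.5          # intra-region random coupling gain (chaotic for g > 1)
rank = 1         # rank of the inter-region communication subspace
dt, T = 0.05, 2000

# Full-rank random connectivity within each of two regions.
J = [g * rng.standard_normal((N, N)) / np.sqrt(N) for _ in range(2)]

# Low-rank connectivity into region a from the other region:
# W_a = u[a] @ v[a].T / N, so only a rank-`rank` signal is routed.
u = [rng.standard_normal((N, rank)) for _ in range(2)]
v = [rng.standard_normal((N, rank)) for _ in range(2)]

x = [rng.standard_normal(N) for _ in range(2)]   # membrane potentials
currents = np.zeros((T, 2))   # cross-region currents (order parameters)

for t in range(T):
    r = [np.tanh(xi) for xi in x]                 # firing rates
    for a in range(2):
        b = 1 - a
        s = v[a].T @ r[b] / N                     # low-dim signal from region b
        x[a] = x[a] + dt * (-x[a] + J[a] @ r[a] + u[a] @ s)
        currents[t, a] = s[0]
```

The point of the sketch is the separation of scales the abstract emphasizes: `J[a] @ r[a]` generates high-dimensional fluctuations within a region, while only the scalar (per rank component) current `s` is transmitted between regions.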