Approximating nonlinear functions with latent boundaries in low-rank excitatory-inhibitory spiking networks (2307.09334v3)

Published 18 Jul 2023 in q-bio.NC and cs.NE

Abstract: Deep feedforward and recurrent rate-based neural networks have become successful functional models of the brain, but they neglect obvious biological details such as spikes and Dale's law. Here we argue that these details are crucial in order to understand how real neural circuits operate. Towards this aim, we put forth a new framework for spike-based computation in low-rank excitatory-inhibitory spiking networks. By considering populations with rank-1 connectivity, we cast each neuron's spiking threshold as a boundary in a low-dimensional input-output space. We then show how the combined thresholds of a population of inhibitory neurons form a stable boundary in this space, and those of a population of excitatory neurons form an unstable boundary. Combining the two boundaries results in a rank-2 excitatory-inhibitory (EI) network with inhibition-stabilized dynamics at the intersection of the two boundaries. The computation of the resulting networks can be understood as the difference of two convex functions and is thereby capable of approximating arbitrary non-linear input-output mappings. We demonstrate several properties of these networks, including noise suppression and amplification, irregular activity and synaptic balance, as well as how they relate to rate network dynamics in the limit that the boundary becomes soft. Finally, while our work focuses on small networks (5-50 neurons), we discuss potential avenues for scaling up to much larger networks. Overall, our work proposes a new perspective on spiking networks that may serve as a starting point for a mechanistic understanding of biological spike-based computation.
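
The abstract's central claim, that the network's computation is a difference of two convex functions and can therefore approximate arbitrary nonlinear input-output mappings, has a simple constructive illustration in one dimension. Below is a minimal NumPy sketch (not the authors' code; the function name dc_decompose_pwl, the sin(2x) target, and the knot placement are illustrative assumptions): it splits a piecewise-linear interpolant of a nonlinear target into g - h with both g and h convex, by assigning positive slope changes to g and negative ones to h.

```python
import numpy as np

def dc_decompose_pwl(knots, values):
    """Split the piecewise-linear interpolant of (knots, values) into g - h
    with g and h both convex. Each convex piece is an affine term plus a
    nonnegatively weighted sum of ReLU kinks, which is equivalent to a
    max-of-affines boundary in the paper's geometric picture."""
    slopes = np.diff(values) / np.diff(knots)
    d = np.diff(slopes)                            # slope change at interior knots
    pos, neg = np.maximum(d, 0.0), np.maximum(-d, 0.0)
    interior = knots[1:-1]

    def ramp(x):                                   # ReLU kinks at the interior knots
        return np.maximum(x[None, :] - interior[:, None], 0.0)

    def g(x):                                      # convex: upward slope changes only
        return values[0] + slopes[0] * (x - knots[0]) + pos @ ramp(x)

    def h(x):                                      # convex: downward slope changes only
        return neg @ ramp(x)

    return g, h

# Approximate an arbitrary nonlinear mapping as a difference of convex functions.
x = np.linspace(-np.pi, np.pi, 500)
knots = np.linspace(-np.pi, np.pi, 25)             # roughly one kink per threshold
g, h = dc_decompose_pwl(knots, np.sin(2 * knots))
print(f"max |(g - h)(x) - sin(2x)| = {np.max(np.abs(g(x) - h(x) - np.sin(2 * x))):.3f}")
```

Since a convex piecewise-linear function is equivalently a maximum of affine functions, g and h here play the roles of the two populations' combined threshold boundaries (inhibitory and excitatory) in the paper's geometric picture, and adding knots, i.e. neurons, tightens the approximation.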
