
Neuronal Temporal Filters as Normal Mode Extractors (2401.03248v1)

Published 6 Jan 2024 in q-bio.NC, cs.SY, eess.SY, nlin.CD, and stat.ML

Abstract: To generate actions in the face of physiological delays, the brain must predict the future. Here we explore how prediction may lie at the core of brain function by considering a neuron predicting the future of a scalar time series input. Assuming that the dynamics of the lag vector (a vector composed of several consecutive elements of the time series) are locally linear, Normal Mode Decomposition decomposes the dynamics into independently evolving (eigen-)modes, allowing for straightforward prediction. We propose that a neuron learns the top mode and projects its input onto the associated subspace. Under this interpretation, the temporal filter of a neuron corresponds to the left eigenvector of a generalized eigenvalue problem. We mathematically analyze the operation of such an algorithm on noisy observations of synthetic data generated by a linear system. Interestingly, the shape of the temporal filter varies with the signal-to-noise ratio (SNR): a noisy input yields a monophasic filter, and a growing SNR leads to multiphasic filters with a progressively greater number of phases. Such variation of the temporal filter with input SNR resembles that observed experimentally in biological neurons.
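The abstract describes a concrete pipeline: stack consecutive samples of the scalar input into a lag vector, assume its dynamics are locally linear, decompose those dynamics into independently evolving normal modes, and use the left eigenvector of the top mode as the neuron's temporal filter, so that prediction reduces to multiplying the filtered output by the mode's eigenvalue. Below is a minimal Python sketch of that pipeline. The synthetic signal, the DMD-style least-squares estimate of the dynamics matrix, and all variable names are illustrative assumptions for this sketch, not the authors' implementation (the paper analyzes the problem mathematically for a known generating system).

```python
import numpy as np

# Sketch: extract the top normal mode of lag-vector dynamics and use its
# left eigenvector as a temporal filter (assumptions noted in the lead-in).

rng = np.random.default_rng(0)

# Synthetic scalar input: noisy observation of a damped oscillation,
# i.e. the output of a linear system, as in the paper's setting.
T, d, sigma = 2000, 8, 0.05          # samples, lag-vector length, noise std
t = np.arange(T)
x = np.exp(-0.002 * t) * np.sin(0.1 * t) + sigma * rng.standard_normal(T)

# Lag vectors: column X[:, k] holds (x_k, ..., x_{k+d-1}).
X = np.stack([x[k:k + d] for k in range(T - d)], axis=1)

# Fit locally linear dynamics X_{t+1} ~ A X_t by least squares
# (a DMD-style estimate; an assumption standing in for known dynamics).
A = X[:, 1:] @ np.linalg.pinv(X[:, :-1])

# Normal mode decomposition: columns of V are right eigenvectors; the
# rows of inv(V) are the left eigenvectors, each projecting the lag
# vector onto one independently evolving mode.
evals, V = np.linalg.eig(A)
W = np.linalg.inv(V)
top = np.argmax(np.abs(evals))       # slowest-decaying ("top") mode

w = W[top]                           # temporal filter = left eigenvector
lam = evals[top]

# The neuron's output is the filtered input; because the mode evolves
# independently, one-step prediction is multiplication by the eigenvalue.
y = w @ X                            # projection of each lag vector
y_pred = lam * y[:-1]                # predicted y_{t+1} = lambda * y_t
print("mean one-step prediction error:", np.mean(np.abs(y[1:] - y_pred)))
```

For an oscillatory mode the eigenvalue and filter are complex (modes come in conjugate pairs), so the real and imaginary parts of w play the role of a quadrature pair of temporal filters; in the paper's analysis the filter shape, and in particular its number of phases, depends on the input SNR.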

