On Leaky-Integrate-and-Fire as Spike-Train Quantization Operator on Dirac-Superimposed Continuous-Time Signals (2402.07954v1)
Abstract: Leaky-integrate-and-fire (LIF) is studied as a non-linear operator that maps an integrable signal $f$ to a sequence $\eta_f$ of discrete events, the spikes. If the input contains no Dirac pulses, it makes no difference whether the neuron's potential is reset to zero or reduced by the threshold $\vartheta$ immediately after a spike-triggering event. With superimposed Dirac pulses, however, the two variants differ, which raises the question of a mathematical justification for each of the proposed reset schemes. In the limit case of zero refractory time, the standard reset scheme based on threshold subtraction yields a modulo-based reset scheme, which allows LIF to be characterized as a quantization operator with respect to a weighted Alexiewicz norm $\|\cdot\|_{A,\alpha}$ with leak parameter $\alpha$. We prove the quantization formula $\|\eta_f - f\|_{A,\alpha} < \vartheta$ under the general conditions of local integrability, boundedness almost everywhere, and locally finitely many superimposed weighted Dirac pulses; this yields a much larger signal space and a more flexible sparse signal representation than classical signal processing can handle.
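The quantization property described in the abstract can be illustrated with a simple discrete-time sketch. Everything below (time step, leak parametrization, function and variable names) is my own assumption for illustration, not the paper's exact operator; for $\alpha = 0$ the running sum of $f\,\mathrm{d}t$ minus the emitted spike weights is exactly the post-reset potential, so its sup-norm, a discrete analogue of the Alexiewicz norm of $\eta_f - f$, stays strictly below $\vartheta$.

```python
import numpy as np

def lif_quantize(f, dt, theta=1.0, alpha=0.0):
    """Discrete LIF with subtract-threshold ("modulo") reset.

    Returns (weights, trace): spike weights w[i] (signed integer multiples
    of theta, zero at non-spike steps) and the post-reset potential trace.
    Illustrative discretization only, not the paper's continuous-time operator.
    """
    decay = np.exp(-alpha * dt)        # leak factor per time step
    u = 0.0
    weights = np.zeros(len(f))
    trace = np.zeros(len(f))
    for i, x in enumerate(f):
        u = decay * u + x * dt         # leaky integration of the input
        if abs(u) >= theta:            # spike: subtract theta once per event
            k = np.floor(abs(u) / theta)
            weights[i] = np.sign(u) * k * theta
            u -= weights[i]            # threshold subtraction = modulo reset
        trace[i] = u                   # residual stays in (-theta, theta)
    return weights, trace

# Check the alpha = 0 quantization bound on a random test signal:
rng = np.random.default_rng(0)
f = rng.standard_normal(10_000) * 5.0
w, u = lif_quantize(f, dt=1e-3, theta=1.0)
err = np.cumsum(f * 1e-3 - w)          # discrete Alexiewicz-type error
assert np.max(np.abs(err)) < 1.0       # quantization bound: error < theta
```

Note that with a positive leak ($\alpha > 0$) the plain cumulative sum is no longer the right error functional; the paper's weighted norm $\|\cdot\|_{A,\alpha}$ accounts for the decay, which the sketch above only checks in the unweighted $\alpha = 0$ case.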