
TDE-3: An improved prior for optical flow computation in spiking neural networks (2402.11662v1)

Published 18 Feb 2024 in cs.NE

Abstract: Motion detection is a primary task required for robotic systems to perceive and navigate in their environment. The bioinspired neuromorphic Time-Difference Encoder (TDE-2) proposed in the literature combines event-based sensors and processors with spiking neural networks to provide real-time, energy-efficient motion detection by extracting temporal correlations between two points in space. However, at the algorithmic level, this design leads to a loss of direction selectivity of individual TDEs in textured environments. Here we propose an augmented 3-point TDE (TDE-3) with an additional inhibitory input that makes the direction selectivity of TDE-3 robust in textured environments. We developed a procedure to train the new TDE-3 using backpropagation through time and surrogate gradients to linearly map input velocities onto an output spike count or an inter-spike interval (ISI). Our work is the first instance of training a spiking neuron to produce a specific ISI. Using synthetic data, we compared training and inference with spike count and ISI with respect to changes in stimulus dynamic range, spatial frequency, and noise level. The ISI turns out to be more robust to variation in spatial frequency, whereas the spike count is a more reliable training signal in the presence of noise. We performed the first in-depth quantitative investigation of optical flow coding with TDEs and compared TDE-2 and TDE-3 in terms of energy efficiency and coding precision. Results show that at the network level both detectors achieve similar precision (20-degree angular error, 88% correlation with ground truth). Yet, due to the more robust direction selectivity of individual TDEs, the TDE-3-based network spikes less and is hence more energy-efficient. The reported precision is on par with that of model-based methods, but the spike-based processing of the TDEs allows more energy-efficient inference on neuromorphic hardware.
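As a rough illustration of the mechanism described in the abstract, the sketch below simulates a single 3-input, TDE-style unit in NumPy: a facilitatory event sets a decaying gain trace, a later trigger event injects a current scaled by that trace, and the extra inhibitory input of TDE-3 vetoes responses to motion in the anti-preferred direction. The time constants, weights, and the subtractive max(fac - inh, 0) gating are illustrative assumptions, not the paper's exact TDE-3 model or training setup.

```python
# Minimal sketch of a 3-input, TDE-style motion detector unit.
# Parameter values and the gating rule are assumptions for illustration,
# not the paper's exact TDE-3 formulation.
import numpy as np

def tde3_response(fac_t, trig_t, inh_t=None, T=200, dt=1.0,
                  tau_fac=20.0, tau_inh=20.0, tau_epsc=10.0, tau_mem=10.0,
                  v_th=1.0, w=4.0):
    """Simulate one unit; returns output spike times (in time steps)."""
    fac = inh = epsc = v = 0.0
    spikes = []
    for t in range(T):
        fac *= np.exp(-dt / tau_fac)    # facilitatory trace (1st input)
        inh *= np.exp(-dt / tau_inh)    # inhibitory trace (3rd input, new in TDE-3)
        epsc *= np.exp(-dt / tau_epsc)  # excitatory post-synaptic current
        if t == fac_t:
            fac = 1.0
        if inh_t is not None and t == inh_t:
            inh = 1.0
        if t == trig_t:
            # Trigger (2nd input): current gated by the facilitatory trace,
            # vetoed by the inhibitory trace.
            epsc += w * max(fac - inh, 0.0)
        v = v * np.exp(-dt / tau_mem) + epsc * dt  # leaky integrate-and-fire
        if v >= v_th:
            spikes.append(t)
            v = 0.0  # reset after an output spike
    return spikes

# Preferred direction: facilitation precedes the trigger -> output spikes,
# with the count depending on the time difference (i.e., the local velocity).
print(tde3_response(fac_t=10, trig_t=15))
# Anti-preferred direction: inhibition arrives first and cancels the response.
print(tde3_response(fac_t=20, trig_t=10, inh_t=5))
```

In a sketch like this, the number of output spikes shrinks as the facilitation-to-trigger delay grows, and the interval between output spikes changes with it; these are the spike-count and ISI velocity codes that the paper trains with backpropagation through time and surrogate gradients.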

