
Backpropagation-free Spiking Neural Networks with the Forward-Forward Algorithm (2502.20411v2)

Published 19 Feb 2025 in cs.NE, cs.AI, and cs.LG

Abstract: Spiking Neural Networks (SNNs) offer a biologically inspired computational paradigm that emulates neuronal activity through discrete spike-based processing. Despite their advantages, training SNNs with traditional backpropagation (BP) remains challenging due to computational inefficiencies and a lack of biological plausibility. This study explores the Forward-Forward (FF) algorithm as an alternative learning framework for SNNs. Unlike backpropagation, which relies on forward and backward passes, the FF algorithm employs two forward passes, enabling layer-wise localized learning, enhanced computational efficiency, and improved compatibility with neuromorphic hardware. We introduce an FF-based SNN training framework and evaluate its performance across both non-spiking (MNIST, Fashion-MNIST, Kuzushiji-MNIST) and spiking (Neuro-MNIST, SHD) datasets. Experimental results demonstrate that our model surpasses existing FF-based SNNs on evaluated static datasets with a much lighter architecture while achieving accuracy comparable to state-of-the-art backpropagation-trained SNNs. On more complex spiking tasks such as SHD, our approach outperforms other SNN models and remains competitive with leading backpropagation-trained SNNs. These findings highlight the FF algorithm's potential to advance SNN training methodologies by addressing some key limitations of backpropagation.
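
For readers unfamiliar with the Forward-Forward algorithm referenced in the abstract, the sketch below illustrates its core layer-wise recipe as introduced by Hinton (2022): each layer is trained locally to push a "goodness" score (here, mean squared activation) above a threshold on positive data and below it on negative data, with no backward pass through earlier layers. This is a minimal, hedged illustration under those assumptions, not the paper's spiking formulation; the class name FFLayer and the hyperparameters theta and lr are hypothetical.

```python
# Minimal sketch of one Forward-Forward training step for a single layer.
# Follows Hinton's original FF recipe (goodness = mean squared activation,
# raised above a threshold for positive data, pushed below it for negative
# data). NOT the paper's exact SNN method; names and values are illustrative.
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    def __init__(self, in_dim, out_dim, theta=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.act = nn.ReLU()
        self.theta = theta  # goodness threshold (hypothetical value)
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize inputs so the next layer cannot read off the previous
        # layer's goodness magnitude directly.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return self.act(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Two forward passes: one on positive data, one on negative data.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        # Softplus loss: penalize low goodness on positives and high
        # goodness on negatives, relative to the threshold theta.
        loss = torch.log1p(torch.exp(torch.cat([
            self.theta - g_pos,
            g_neg - self.theta,
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()  # gradient is local to this layer only
        self.opt.step()
        # Detach outputs so the next layer trains on them independently.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()
```

Because each layer detaches its outputs before passing them on, later layers learn from earlier ones without any global backward pass, which is the property that makes FF-style training attractive for neuromorphic hardware as the abstract notes.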

Authors (4)
  1. Mohammadnavid Ghader (1 paper)
  2. Saeed Reza Kheradpisheh (20 papers)
  3. Bahar Farahani (3 papers)
  4. Mahmood Fazlali (2 papers)
