
The Potential of Combined Learning Strategies to Enhance Energy Efficiency of Spiking Neuromorphic Systems (2408.07150v1)

Published 13 Aug 2024 in cs.NE

Abstract: Ensuring energy-efficient design in neuromorphic computing systems necessitates a tailored architecture combined with algorithmic approaches. This manuscript focuses on enhancing brain-inspired perceptual computing machines through a novel combined learning approach for Convolutional Spiking Neural Networks (CSNNs). CSNNs present a promising alternative to traditional power-intensive and complex machine learning methods like backpropagation, offering energy-efficient spiking neuron processing inspired by the human brain. The proposed combined learning method integrates pair-based spike-timing-dependent plasticity (PSTDP) and power-law-dependent spike-timing-dependent plasticity (STDP) to adjust synaptic efficacies, enabling the utilization of stochastic elements like memristive devices to enhance energy efficiency and improve perceptual computing accuracy. By reducing learning parameters while maintaining accuracy, these systems consume less energy and have reduced area overhead, making them more suitable for hardware implementation. The research delves into neuromorphic design architectures, focusing on CSNNs to provide a general framework for energy-efficient computing hardware. Various CSNN architectures are evaluated to assess how fewer trainable parameters can maintain acceptable accuracy in perceptual computing systems, positioning them as viable candidates for neuromorphic architecture. Comparisons with previous work validate the achievements and methodology of the proposed architecture.
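To make the combined-learning idea concrete, the sketch below shows a pair-based STDP update whose potentiation and depression terms are scaled by a power-law weight dependence, which is one common way such rules bound synaptic efficacies. This is a minimal illustration under assumed settings: the parameter names and values (A_PLUS, A_MINUS, TAU_PLUS, TAU_MINUS, MU, W_MAX) are hypothetical and are not the configuration reported in the paper.

```python
import numpy as np

# Illustrative parameters (assumptions, not the paper's reported values).
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # STDP time constants in ms
MU = 0.9                           # power-law weight-dependence exponent
W_MAX = 1.0                        # upper bound on synaptic efficacy


def pair_stdp_delta(w, dt_ms):
    """Weight change for one pre/post spike pair.

    dt_ms = t_post - t_pre. A positive dt (pre before post) potentiates;
    a negative dt depresses. The power-law factors scale the update by how
    close the weight is to its bounds, keeping it within [0, W_MAX].
    """
    if dt_ms > 0:   # causal pair: potentiation scaled by remaining headroom
        return A_PLUS * ((W_MAX - w) ** MU) * np.exp(-dt_ms / TAU_PLUS)
    else:           # anti-causal pair: depression scaled by current weight
        return -A_MINUS * (w ** MU) * np.exp(dt_ms / TAU_MINUS)


# Usage example: update a small synapse array from one spike pair per synapse.
weights = np.array([0.2, 0.5, 0.8])
spike_timing_diffs = np.array([5.0, -3.0, 12.0])   # ms, t_post - t_pre
deltas = np.array([pair_stdp_delta(w, dt)
                   for w, dt in zip(weights, spike_timing_diffs)])
weights = np.clip(weights + deltas, 0.0, W_MAX)
print(weights)
```

In a hardware-oriented setting, the weight-dependent scaling is attractive because it tolerates the stochastic, bounded conductance updates of devices such as memristors; the exponent and amplitudes here are placeholders for whatever the target device characterization would dictate.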
