
CCSRP: Robust Pruning of Spiking Neural Networks through Cooperative Coevolution (2408.00794v1)

Published 18 Jul 2024 in cs.NE, cs.AI, and cs.CV

Abstract: Spiking neural networks (SNNs) have shown promise in various dynamic visual tasks, yet those ready for practical deployment often lack the compactness and robustness essential in resource-limited and safety-critical settings. Prior research has predominantly concentrated on enhancing the compactness or robustness of artificial neural networks through strategies like network pruning and adversarial training, with little exploration into similar methodologies for SNNs. Robust pruning of SNNs aims to reduce computational overhead while preserving both accuracy and robustness. Current robust pruning approaches generally necessitate expert knowledge and iterative experimentation to establish suitable pruning criteria or auxiliary modules, thus constraining their broader application. Concurrently, evolutionary algorithms (EAs) have been employed to automate the pruning of artificial neural networks, delivering remarkable outcomes yet overlooking the aspect of robustness. In this work, we propose CCSRP, an innovative robust pruning method for SNNs, underpinned by cooperative co-evolution. Robust pruning is articulated as a tri-objective optimization challenge, striving to balance accuracy, robustness, and compactness concurrently, resolved through a cooperative co-evolutionary pruning framework that independently prunes filters across layers using EAs. Our experiments on CIFAR-10 and SVHN demonstrate that CCSRP can match or exceed the performance of the latest methodologies.
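The abstract outlines the core mechanism: each layer's filter mask is evolved by its own subpopulation, individuals are evaluated in the context of the best masks currently held by the other layers, and fitness balances accuracy, robustness, and compactness. The sketch below illustrates that cooperative co-evolutionary loop in plain NumPy. It is a minimal sketch, not the paper's implementation: the layer sizes, bit-flip mutation, and synthetic fitness proxies are illustrative assumptions, and the three objectives are scalarised with a weighted sum, whereas the paper poses pruning as a genuine tri-objective problem evaluated on a pruned SNN under clean and adversarial inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical number of filters per prunable layer; not taken from the paper.
LAYER_SIZES = [16, 32, 64]
POP_SIZE = 8
GENERATIONS = 5

def evaluate(masks):
    """Placeholder objective values. In CCSRP these would be measured on the
    pruned SNN (clean accuracy, adversarial robustness, remaining filters);
    here they are synthetic proxies so the sketch runs standalone."""
    kept = np.concatenate(masks)
    density = kept.mean()                              # fraction of filters retained
    accuracy = 0.9 * density + 0.05 * rng.random()     # stand-in for clean accuracy
    robustness = 0.8 * density + 0.05 * rng.random()   # stand-in for robust accuracy
    compactness = 1.0 - density                        # higher means more pruning
    return accuracy, robustness, compactness

def fitness(masks, w=(1.0, 1.0, 0.5)):
    """Weighted-sum scalarisation of the three objectives (a simplification
    of the paper's tri-objective formulation)."""
    acc, rob, comp = evaluate(masks)
    return w[0] * acc + w[1] * rob + w[2] * comp

# One subpopulation of binary filter masks per layer.
subpops = [rng.integers(0, 2, size=(POP_SIZE, n)) for n in LAYER_SIZES]
# Current best mask per layer, used as the cooperative "context" for other layers.
context = [pop[0].copy() for pop in subpops]

for gen in range(GENERATIONS):
    for li, pop in enumerate(subpops):
        scores = []
        for indiv in pop:
            candidate = [c.copy() for c in context]
            candidate[li] = indiv                # evaluate this layer's mask in context
            scores.append(fitness(candidate))
        order = np.argsort(scores)[::-1]
        pop[:] = pop[order]                      # sort subpopulation by fitness
        context[li] = pop[0].copy()              # update the shared context

        # Simple variation: keep the top half, refill with mutated copies.
        half = POP_SIZE // 2
        for j in range(half, POP_SIZE):
            child = pop[j - half].copy()
            flip = rng.random(child.shape) < 0.1  # bit-flip mutation on the mask
            child[flip] ^= 1
            pop[j] = child

print("kept filters per layer:", [int(m.sum()) for m in context])
print("final fitness:", fitness(context))
```

The key design point carried over from the abstract is the decomposition: each layer is pruned by an independent evolutionary search, and cooperation happens only through the shared context of best-so-far masks used during fitness evaluation.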
