
Probabilistic Modeling: Proving the Lottery Ticket Hypothesis in Spiking Neural Network (2305.12148v1)

Published 20 May 2023 in cs.LG

Abstract: The Lottery Ticket Hypothesis (LTH) states that a randomly-initialized large neural network contains a small sub-network (i.e., a winning ticket) which, when trained in isolation, can achieve performance comparable to the large network. LTH opens up a new path for network pruning. Existing proofs of LTH in Artificial Neural Networks (ANNs) are based on continuous activation functions, such as ReLU, which satisfy the Lipschitz condition. However, these theoretical methods are not applicable to Spiking Neural Networks (SNNs) due to the discontinuity of the spiking function. We argue that it is possible to extend the scope of LTH by eliminating the Lipschitz condition. Specifically, we propose a novel probabilistic modeling approach for spiking neurons with complicated spatio-temporal dynamics. We then prove, both theoretically and experimentally, that LTH holds in SNNs. According to our theorem, we conclude that pruning existing SNNs directly by weight magnitude is clearly not optimal. We further design a new pruning criterion based on our theory, which achieves better pruning results than the baseline.
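The abstract refers to pruning "in accordance with the weight size" as the baseline the paper improves upon. A minimal sketch of that magnitude-based baseline criterion is shown below (illustrative NumPy code with an assumed `magnitude_prune` helper; it does not reproduce the paper's probabilistic criterion):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude weights.

    This is the standard magnitude-based baseline the abstract argues is
    suboptimal for SNNs; the paper's probabilistic criterion is not shown here.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # Threshold at the k-th smallest magnitude; keep strictly larger weights.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, 0.5)  # remove the 50% smallest-magnitude weights
```

The paper's contribution is replacing this `|w|`-based ranking with a criterion derived from its probabilistic model of spiking dynamics.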

Authors (7)
  1. Man Yao (18 papers)
  2. Yuhong Chou (10 papers)
  3. Guangshe Zhao (7 papers)
  4. Xiawu Zheng (63 papers)
  5. Yonghong Tian (184 papers)
  6. Bo Xu (212 papers)
  7. Guoqi Li (90 papers)
Citations (4)
