Fast On-Device Adaptation for Spiking Neural Networks via Online-Within-Online Meta-Learning (2103.03901v1)

Published 21 Feb 2021 in cs.NE, cs.LG, and eess.SP

Abstract: Spiking Neural Networks (SNNs) have recently gained popularity as machine learning models for on-device edge intelligence in applications such as mobile healthcare management and natural language processing due to their low power profile. In such highly personalized use cases, it is important for the model to be able to adapt to the unique features of an individual with only a minimal amount of training data. Meta-learning has been proposed as a way to train models that are geared towards quick adaptation to new tasks. The few existing meta-learning solutions for SNNs operate offline and require some form of backpropagation that is incompatible with current neuromorphic edge devices. In this paper, we propose an online-within-online meta-learning rule for SNNs, termed OWOML-SNN, that enables lifelong learning on a stream of tasks and relies on local, backprop-free, nested updates.
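To make the "online-within-online" structure concrete, below is a minimal sketch of nested online meta-learning over a task stream. It is an illustrative assumption, not the paper's OWOML-SNN rule: the loss, the `local_update` function, the dimensions, and the Reptile-style first-order meta-update are all hypothetical stand-ins for the paper's local, spike-based nested updates.

```python
import numpy as np

# Hypothetical sketch of an online-within-online meta-learning loop.
# Inner loop: per-task adaptation with a local (backprop-free) update.
# Outer loop: the meta-parameters (shared initialization) are updated online
# as tasks arrive, in the spirit of lifelong learning on a task stream.

rng = np.random.default_rng(0)
DIM = 16            # hypothetical weight dimension
INNER_STEPS = 5     # within-task (inner) updates
META_LR = 0.1       # outer (meta) learning rate
INNER_LR = 0.05     # inner learning rate

def task_loss(w, task_target):
    """Hypothetical per-task loss: squared distance to a task-specific target."""
    return float(np.sum((w - task_target) ** 2))

def local_update(w, task_target, lr):
    """Backprop-free local update sketch: for this quadratic loss the error
    signal (w - task_target) is available locally; a real SNN rule would
    instead use spike-based, layer-local learning signals."""
    return w - lr * 2.0 * (w - task_target)

meta_w = np.zeros(DIM)                                      # meta-learned initialization
task_stream = (rng.normal(size=DIM) for _ in range(100))    # stream of tasks

for task_target in task_stream:
    w = meta_w.copy()                        # start each task from the meta-initialization
    for _ in range(INNER_STEPS):             # inner online loop: adapt to the current task
        w = local_update(w, task_target, INNER_LR)
    # outer online loop: first-order (Reptile-style) meta-update toward the adapted weights
    meta_w += META_LR * (w - meta_w)

print("final meta-initialization norm:", np.linalg.norm(meta_w))
```

The key structural point the sketch illustrates is that both levels run online: each incoming task triggers a short burst of local inner updates, and the meta-parameters are refined immediately afterward rather than in an offline batch over all tasks.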

Authors (3)
  1. Bleema Rosenfeld (3 papers)
  2. Bipin Rajendran (50 papers)
  3. Osvaldo Simeone (326 papers)
Citations (9)
