Design Principles for Lifelong Learning AI Accelerators (2310.04467v1)

Published 5 Oct 2023 in cs.LG, cs.AI, cs.SY, and eess.SY

Abstract: Lifelong learning - an agent's ability to learn throughout its lifetime - is a hallmark of biological learning systems and a central challenge for AI. The development of lifelong learning algorithms could lead to a range of novel AI applications, but this will also require the development of appropriate hardware accelerators, particularly if the models are to be deployed on edge platforms, which have strict size, weight, and power constraints. Here, we explore the design of lifelong learning AI accelerators that are intended for deployment in untethered environments. We identify key desirable capabilities for lifelong learning accelerators and highlight metrics to evaluate such accelerators. We then discuss current edge AI accelerators and explore the future design of lifelong learning accelerators, considering the role that different emerging technologies could play.

Authors (12)
  1. Dhireesha Kudithipudi (31 papers)
  2. Anurag Daram (6 papers)
  3. Abdullah M. Zyarah (9 papers)
  4. Fatima Tuz Zohora (4 papers)
  5. James B. Aimone (26 papers)
  6. Angel Yanguas-Gil (18 papers)
  7. Nicholas Soures (7 papers)
  8. Emre Neftci (46 papers)
  9. Matthew Mattina (35 papers)
  10. Vincenzo Lomonaco (58 papers)
  11. Clare D. Thiem (3 papers)
  12. Benjamin Epstein (2 papers)
Citations (14)