
Survey of Machine Learning Accelerators (2009.00993v1)

Published 1 Sep 2020 in cs.DC and cs.LG

Abstract: New machine learning accelerators are being announced and released each month for a variety of applications, from speech recognition and video object detection to assisted driving and many data center applications. This paper updates the survey of AI accelerators and processors from last year's IEEE-HPEC paper. It collects and summarizes the accelerators that have been publicly announced with performance and power consumption numbers. The performance and power values are plotted on a scatter graph, and a number of dimensions and observations from the trends on this plot are discussed and analyzed. For instance, there are interesting trends in the plot regarding power consumption, numerical precision, and inference versus training. This year, many more accelerators have been announced, implemented with a wider range of architectures and technologies, including vector engines, dataflow engines, neuromorphic designs, flash-based analog memory processing, and photonic-based processing.
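The survey's central visualization places each accelerator on a log-log scatter plot of peak performance versus power, where lines of constant efficiency (performance per watt) appear as diagonals. A minimal sketch of that idea is below; the accelerator names and numbers are hypothetical placeholders, not values from the paper's dataset.

```python
import math

# Hypothetical, illustrative data points -- NOT from the paper's survey:
# (name, peak performance [GOPS], power [W])
accelerators = [
    ("embedded-A", 1_000, 1.0),
    ("autonomous-B", 50_000, 30.0),
    ("datacenter-C", 400_000, 300.0),
]

def log_log_point(perf_gops, power_w):
    """Coordinates of one accelerator on a log10-log10 power/performance plot."""
    return (math.log10(power_w), math.log10(perf_gops))

for name, perf, power in accelerators:
    x, y = log_log_point(perf, power)
    efficiency = perf / power  # GOPS/W; constant-efficiency contours are diagonals
    print(f"{name}: plot point = ({x:.2f}, {y:.2f}), {efficiency:.1f} GOPS/W")
```

Comparing efficiency (GOPS/W) alongside raw peak performance is what lets the survey group accelerators into embedded, autonomous, and data center regimes, and to contrast inference-oriented low-precision parts with training-oriented ones.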

Authors (6)
  1. Albert Reuther (74 papers)
  2. Peter Michaleas (68 papers)
  3. Michael Jones (92 papers)
  4. Vijay Gadepally (131 papers)
  5. Siddharth Samsi (74 papers)
  6. Jeremy Kepner (141 papers)
Citations (122)