
A Survey of Large-Scale Deep Learning Serving System Optimization: Challenges and Opportunities (2111.14247v2)

Published 28 Nov 2021 in cs.LG and cs.DC

Abstract: Deep Learning (DL) models have achieved superior performance in many application domains, including vision, language, medicine, commercial ads, and entertainment. With this rapid development, both DL applications and the underlying serving hardware have demonstrated strong scaling trends, i.e., Model Scaling and Compute Scaling: for example, recent pre-trained models with hundreds of billions of parameters and ~TB-level memory consumption, as well as the newest GPU accelerators providing hundreds of TFLOPS. With both scaling trends, new problems and challenges emerge in DL inference serving, which is gradually trending towards Large-scale Deep learning Serving systems (LDS). This survey aims to summarize and categorize the emerging challenges and optimization opportunities for large-scale deep learning serving systems. By providing a novel taxonomy, summarizing the computing paradigms, and elaborating on recent technical advances, we hope this survey can shed light on new optimization perspectives and motivate novel work in large-scale deep learning system optimization.
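To put the abstract's scaling figures in perspective, the following back-of-envelope sketch (not from the paper; the 175B parameter count and byte-per-parameter precisions are illustrative assumptions) shows how a model with hundreds of billions of parameters reaches ~TB-level memory at serving time:

```python
# Rough estimate of inference memory for a large pre-trained model.
# The 175B parameter count and precision choices below are illustrative
# assumptions, not figures taken from the survey.

def weight_memory_gb(num_params: float, bytes_per_param: int) -> float:
    """Memory needed just to hold the model weights, in GB."""
    return num_params * bytes_per_param / 1e9

params = 175e9  # "hundreds of billions of parameters" (assumed example)

for precision, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    print(f"{precision}: {weight_memory_gb(params, nbytes):,.0f} GB")

# fp32: 700 GB, fp16: 350 GB, int8: 175 GB.
# Activation and intermediate-state memory during serving push the total
# toward the ~TB level mentioned in the abstract.
```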

Authors (7)
  1. Fuxun Yu (39 papers)
  2. Di Wang (407 papers)
  3. Longfei Shangguan (11 papers)
  4. Minjia Zhang (54 papers)
  5. Xulong Tang (23 papers)
  6. Chenchen Liu (24 papers)
  7. Xiang Chen (343 papers)
Citations (8)
