SuperScaler: Supporting Flexible DNN Parallelization via a Unified Abstraction (2301.08984v1)

Published 21 Jan 2023 in cs.DC and cs.AI

Abstract: With growing model sizes, deep neural networks (DNNs) are increasingly trained over massive numbers of GPU accelerators, which demands a proper parallelization plan that transforms a DNN model into fine-grained tasks and then schedules them to GPUs for execution. Due to the large search space, contemporary parallelization plan generators often rely on empirical rules that couple transformation and scheduling, and fall short in exploring more flexible schedules that yield better memory usage and compute efficiency. This tension can be exacerbated by emerging models of increasing structural complexity and model size. SuperScaler is a system that facilitates the design and generation of highly flexible parallelization plans. It explicitly formulates plan design and generation into three sequential phases: model transformation, space-time scheduling, and data dependency preserving. This principled approach decouples multiple seemingly intertwined factors and enables the composition of highly flexible parallelization plans. As a result, SuperScaler can not only generate empirical parallelization plans but also construct new plans that achieve up to 3.5X speedup over state-of-the-art solutions such as DeepSpeed, Megatron, and Alpa, both for emerging DNN models like Swin-Transformer and AlphaFold2 and for well-optimized models like GPT-3.
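
The three-phase decomposition in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical Python rendering based only on the abstract; none of the class or function names below come from SuperScaler's actual implementation. Transformation splits operators into fine-grained tasks, space-time scheduling assigns each task a device (space) and an execution step (time), and dependency preserving inserts communication wherever a producer and its consumer land on different devices.

```python
# Hypothetical sketch of a three-phase parallelization-plan pipeline,
# modeled on the abstract's description. All names are illustrative.
from dataclasses import dataclass, field


@dataclass
class Task:
    """A fine-grained compute task produced by model transformation."""
    name: str
    depends_on: list = field(default_factory=list)  # upstream task names


@dataclass
class Placement:
    """A space-time schedule entry: which device runs a task, and when."""
    task: Task
    device: int  # "space": the GPU the task is assigned to
    step: int    # "time": the execution step on that GPU


def transform(model_ops, split_factor):
    """Phase 1: split each operator into fine-grained tasks
    (here, simply `split_factor` shards per op)."""
    return [Task(f"{op}:{i}") for op in model_ops for i in range(split_factor)]


def schedule(tasks, num_devices):
    """Phase 2: space-time scheduling. A trivial round-robin placement
    stands in for the flexible schedule search the paper describes."""
    return [Placement(t, device=i % num_devices, step=i // num_devices)
            for i, t in enumerate(tasks)]


def preserve_dependencies(placements):
    """Phase 3: find producer/consumer pairs that landed on different
    devices; each such edge would need a communication operator."""
    by_name = {p.task.name: p for p in placements}
    comms = []
    for p in placements:
        for dep in p.task.depends_on:
            src = by_name[dep]
            if src.device != p.device:
                comms.append((dep, src.device, p.task.name, p.device))
    return comms


# Usage: a two-operator "model" split 2-way and scheduled on 2 GPUs.
tasks = transform(["matmul", "gelu"], split_factor=2)
tasks[3].depends_on = [tasks[0].name]  # gelu:1 consumes matmul:0
plan = schedule(tasks, num_devices=2)
print(preserve_dependencies(plan))     # cross-device edges needing comm ops
```

The point of separating the phases, as the abstract argues, is that the scheduler above could be swapped for any placement policy without touching the transformation or the dependency-preservation logic.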

Authors (13)
  1. Zhiqi Lin
  2. Youshan Miao
  3. Guodong Liu
  4. Xiaoxiang Shi
  5. Quanlu Zhang
  6. Fan Yang
  7. Saeed Maleki
  8. Yi Zhu
  9. Xu Cao
  10. Cheng Li
  11. Mao Yang
  12. Lintao Zhang
  13. Lidong Zhou
Citations (5)
