
LocMoE: A Low-Overhead MoE for Large Language Model Training (2401.13920v3)

Published 25 Jan 2024 in cs.LG, cs.AI, and cs.CL

Abstract: The Mixture-of-Experts (MoE) model is a widely used distributed and integrated learning method for large language models (LLMs), favored for its ability to sparsify and scale models efficiently. However, the performance of MoE is limited by load imbalance and the high latency of All-to-All communication, along with relatively redundant computation caused by large expert capacity. Load imbalance may result from existing routing policies that consistently tend to select certain experts. The frequent inter-node communication in the All-to-All procedure also significantly prolongs training time. To alleviate these performance problems, we propose a novel routing strategy that combines load balance and locality by converting part of the inter-node communication into intra-node communication. Notably, we show that there is a minimum threshold for expert capacity, calculated from the maximal angular deviation between the gating weights of the experts and the assigned tokens. We apply these modifications to the PanGu-Sigma model based on the MindSpore framework with multi-level routing and conduct experiments on Ascend clusters. The results demonstrate that the proposed LocMoE reduces training time per epoch by 12.68% to 22.24% compared to classical routers, such as the hash router and switch router, without impacting model accuracy.
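
To make the locality idea concrete, the sketch below shows one way a locality-biased top-1 router could look. This is a minimal, hypothetical PyTorch sketch, not the authors' code: the function name `locality_aware_top1_route`, the additive `bias` bonus for node-local experts, and the argument layout are all illustrative assumptions about the general technique the abstract describes (steering tokens toward experts on the same node so that part of the All-to-All traffic stays intra-node).

```python
# Hypothetical sketch of locality-aware top-1 MoE routing (assumed design,
# not LocMoE's exact gating rule).
import torch

def locality_aware_top1_route(hidden, gate_weight, local_expert_ids, bias=0.5):
    """Route each token to one expert, nudging the choice toward intra-node experts.

    hidden:           (num_tokens, d_model) token activations
    gate_weight:      (d_model, num_experts) gating projection
    local_expert_ids: indices of experts resident on the current node
    bias:             additive score bonus for local experts (assumed hyperparameter)
    """
    logits = hidden @ gate_weight                     # (num_tokens, num_experts)
    scores = torch.softmax(logits, dim=-1)
    # Favor experts on the same node so part of the All-to-All communication
    # becomes intra-node instead of inter-node.
    biased = scores.clone()
    biased[:, local_expert_ids] += bias
    expert_ids = biased.argmax(dim=-1)                # (num_tokens,)
    gate_probs = scores.gather(1, expert_ids.unsqueeze(1)).squeeze(1)
    return expert_ids, gate_probs

if __name__ == "__main__":
    torch.manual_seed(0)
    tokens = torch.randn(8, 16)
    gate = torch.randn(16, 4)
    ids, probs = locality_aware_top1_route(tokens, gate, local_expert_ids=[0, 1])
    print(ids, probs)
```

In a real system the bias would have to be traded off against load balance; the paper's contribution is a routing strategy that combines both objectives rather than a fixed additive bonus.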

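The abstract also mentions a minimum expert-capacity threshold derived from the maximal angular deviation between an expert's gating weights and its assigned tokens. The sketch below only computes that angular-deviation quantity as one plausible construction (arccos of cosine similarity); it does not reproduce the paper's capacity bound itself, and the function name and interface are assumptions.

```python
# Hypothetical sketch: maximal angular deviation between an expert's gating
# vector and the tokens routed to it. The paper's exact capacity formula is
# not reproduced here.
import torch
import torch.nn.functional as F

def max_angular_deviation(hidden, gate_weight, expert_ids, expert):
    """Largest angle (radians) between expert `expert`'s gating vector and its tokens."""
    w = gate_weight[:, expert]                        # (d_model,)
    toks = hidden[expert_ids == expert]               # tokens assigned to this expert
    if toks.numel() == 0:
        return torch.tensor(0.0)
    cos = F.cosine_similarity(toks, w.unsqueeze(0), dim=-1)
    return torch.arccos(cos.clamp(-1.0, 1.0)).max()

if __name__ == "__main__":
    torch.manual_seed(0)
    hidden = torch.randn(8, 16)
    gate = torch.randn(16, 4)
    assigned = torch.randint(0, 4, (8,))              # toy routing decisions
    print(max_angular_deviation(hidden, gate, assigned, expert=2))
```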
Authors (9)
  1. Jing Li (621 papers)
  2. Zhijie Sun (7 papers)
  3. Xuan He (37 papers)
  4. Li Zeng (68 papers)
  5. Yi Lin (103 papers)
  6. Entong Li (4 papers)
  7. Binfan Zheng (5 papers)
  8. Rongqian Zhao (4 papers)
  9. Xin Chen (456 papers)
Citations (6)