Split Learning over Wireless Networks: Parallel Design and Resource Management (2204.08119v2)

Published 18 Apr 2022 in cs.NI

Abstract: Split learning (SL) is a collaborative learning framework that can train an AI model between a device and an edge server by splitting the AI model into a device-side model and a server-side model at a cut layer. The existing SL approach conducts the training process sequentially across devices, which incurs significant training latency, especially when the number of devices is large. In this paper, we design a novel SL scheme to reduce the training latency, named Cluster-based Parallel SL (CPSL), which conducts model training in a "first-parallel-then-sequential" manner. Specifically, CPSL partitions devices into several clusters, trains device-side models in each cluster in parallel and aggregates them, and then trains the whole AI model sequentially across clusters, thereby parallelizing the training process and reducing training latency. Furthermore, we propose a resource management algorithm to minimize the training latency of CPSL considering device heterogeneity and network dynamics in wireless networks. This is achieved by stochastically optimizing the cut layer selection, real-time device clustering, and radio spectrum allocation. The proposed two-timescale algorithm jointly makes the cut layer selection decision on a large timescale and the device clustering and radio spectrum allocation decisions on a small timescale. Extensive simulation results on non-independent and identically distributed data demonstrate that the proposed solutions can greatly reduce the training latency compared with existing SL benchmarks, while adapting to network dynamics.
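The "first-parallel-then-sequential" procedure described in the abstract can be illustrated with a short training-loop sketch. The code below is a minimal, hedged sketch in a PyTorch-style setup: the function names (fedavg, train_cpsl), the averaging rule for intra-cluster aggregation, the SGD optimizers, and the learning rate are illustrative assumptions rather than the authors' implementation; intra-cluster parallelism is emulated serially, and the cut layer selection, device clustering, and spectrum allocation decisions from the resource management algorithm are omitted.

```python
# Sketch of the CPSL "first-parallel-then-sequential" loop (assumptions noted above).
import copy
import torch

def fedavg(models):
    """Average the parameters of device-side models within a cluster (assumed rule)."""
    avg = copy.deepcopy(models[0])
    with torch.no_grad():
        for name, param in avg.named_parameters():
            stacked = torch.stack([dict(m.named_parameters())[name] for m in models])
            param.copy_(stacked.mean(dim=0))
    return avg

def train_cpsl(clusters, device_model, server_model, loss_fn, lr=0.01, epochs=1):
    """clusters: list of clusters; each cluster is a list of per-device DataLoaders."""
    for _ in range(epochs):
        # Sequential pass over clusters ("then-sequential").
        for cluster in clusters:
            # Intra-cluster training ("first-parallel"), emulated serially here.
            local_models = [copy.deepcopy(device_model) for _ in cluster]
            for local, loader in zip(local_models, cluster):
                opt_d = torch.optim.SGD(local.parameters(), lr=lr)
                opt_s = torch.optim.SGD(server_model.parameters(), lr=lr)
                for x, y in loader:
                    # Device-side forward up to the cut layer; send smashed data to the server.
                    smashed = local(x)
                    smashed_srv = smashed.detach().requires_grad_(True)
                    out = server_model(smashed_srv)
                    loss = loss_fn(out, y)
                    # Server-side backward and update.
                    opt_s.zero_grad()
                    loss.backward()
                    opt_s.step()
                    # Return the cut-layer gradient to the device and update the device-side model.
                    opt_d.zero_grad()
                    smashed.backward(smashed_srv.grad)
                    opt_d.step()
            # Aggregate device-side models within the cluster before moving to the next one.
            device_model = fedavg(local_models)
    return device_model, server_model
```

In this sketch, only activations at the cut layer ("smashed data") and the corresponding gradients cross the device-server boundary, which is the communication pattern that the paper's cut layer selection and spectrum allocation decisions are designed to optimize.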

Authors (8)
  1. Wen Wu (103 papers)
  2. Mushu Li (27 papers)
  3. Kaige Qu (10 papers)
  4. Conghao Zhou (37 papers)
  5. Xuemin (Sherman) Shen
  6. Weihua Zhuang (49 papers)
  7. Xu Li (126 papers)
  8. Weisen Shi (9 papers)
Citations (109)
