
Communication-Efficient and Distributed Learning Over Wireless Networks: Principles and Applications (2008.02608v1)

Published 6 Aug 2020 in cs.LG, cs.IT, cs.NI, math.IT, and stat.ML

Abstract: Machine learning (ML) is a promising enabler for the fifth generation (5G) communication systems and beyond. By imbuing intelligence into the network edge, edge nodes can proactively carry out decision-making, and thereby react to local environmental changes and disturbances while experiencing zero communication latency. To achieve this goal, it is essential to cater for high ML inference accuracy at scale under time-varying channel and network dynamics, by continuously exchanging fresh data and ML model updates in a distributed way. Taming this new kind of data traffic boils down to improving the communication efficiency of distributed learning by optimizing communication payload types, transmission techniques, and scheduling, as well as ML architectures, algorithms, and data processing methods. To this end, this article aims to provide a holistic overview of relevant communication and ML principles, and thereby present communication-efficient and distributed learning frameworks with selected use cases.
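
To make the abstract's central lever concrete, the sketch below illustrates one common payload-optimization idea in communication-efficient distributed learning: workers upload top-k sparsified model updates instead of full gradients, so a d-dimensional update costs roughly k (index, value) pairs rather than d floats. This is a minimal illustrative example, not code from the paper; the function names, learning rate, and synchronous-averaging setup are all assumptions.

```python
# Minimal sketch of communication-efficient distributed learning via
# top-k gradient sparsification. Illustrative only; not the paper's method.
import numpy as np

def top_k_sparsify(update: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude entries of the update, zeroing
    the rest. Uploading (index, value) pairs for k entries shrinks the
    communication payload from d floats to ~k pairs."""
    sparse = np.zeros_like(update)
    idx = np.argpartition(np.abs(update), -k)[-k:]  # indices of k largest |values|
    sparse[idx] = update[idx]
    return sparse

def federated_round(global_model: np.ndarray,
                    local_grads: list[np.ndarray],
                    k: int,
                    lr: float = 0.1) -> np.ndarray:
    """One synchronous round: each worker uploads a sparsified gradient;
    the server averages the compressed updates and takes a gradient step."""
    compressed = [top_k_sparsify(g, k) for g in local_grads]
    avg_update = np.mean(compressed, axis=0)
    return global_model - lr * avg_update

# Toy usage: 4 workers, a 1000-dimensional model, each uploading only
# its 50 largest gradient entries (a 20x payload reduction).
rng = np.random.default_rng(0)
model = rng.normal(size=1000)
grads = [rng.normal(size=1000) for _ in range(4)]
model = federated_round(model, grads, k=50)
```

In practice, schemes like this are combined with error-feedback accumulators and scheduling over time-varying channels; the sketch only shows the payload-compression step the abstract refers to.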

Authors (7)
  1. Jihong Park (123 papers)
  2. Sumudu Samarakoon (52 papers)
  3. Anis Elgabli (28 papers)
  4. Joongheon Kim (101 papers)
  5. Mehdi Bennis (332 papers)
  6. Seong-Lyun Kim (81 papers)
  7. Mérouane Debbah (634 papers)
Citations (152)