
Wireless Federated Learning over Resource-Constrained Networks: Digital versus Analog Transmissions (2405.17759v1)

Published 28 May 2024 in cs.IT and math.IT

Abstract: To enable wireless federated learning (FL) in communication resource-constrained networks, two communication schemes, digital and analog transmission, are effective solutions. In this paper, we quantitatively compare these two techniques, highlighting their essential differences as well as the scenarios each is suited to. We first examine both digital and analog transmission schemes, together with a unified and fair comparison framework under imbalanced device sampling, strict latency targets, and transmit power constraints. A universal convergence analysis under various imperfections is established for evaluating the performance of FL over wireless networks. These analytical results reveal that the fundamental difference between digital and analog communications lies in whether communication and computation are jointly designed or not. The digital scheme decouples the communication design from the FL computing task, making it difficult to support uplink transmission from a massive number of devices within limited bandwidth; hence, its performance is mainly communication-limited. In contrast, analog communication allows over-the-air computation (AirComp) and achieves better spectrum utilization. However, the computation-oriented analog transmission reduces power efficiency, and its performance is sensitive to computation errors caused by imperfect channel state information (CSI). Furthermore, device sampling for both schemes is optimized, and the differences in the sampling optimization are analyzed. Numerical results verify the theoretical analysis and affirm the superior performance of the sampling optimization.
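The contrast described in the abstract (digital aggregation over orthogonal, quantized uplinks versus analog AirComp aggregation that is spectrum-efficient but sensitive to imperfect CSI) can be illustrated with a minimal toy simulation. The sketch below is not the paper's system model; the device count, quantization resolution, SNR, and CSI-error level are illustrative assumptions chosen only to show the qualitative trade-off.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative toy setup (assumed values, not the paper's parameters) ---
num_devices = 20          # K sampled devices
model_dim = 1000          # dimension of each local model update
snr_db = 10.0             # receive SNR for the analog uplink
csi_error_std = 0.05      # relative channel-estimation error (analog scheme)

# True local updates; error-free aggregation should recover their average.
updates = rng.normal(size=(num_devices, model_dim))
ideal_avg = updates.mean(axis=0)

noise_std = 10 ** (-snr_db / 20)

# Digital scheme: each device quantizes its update and sends it on an
# orthogonal resource. Aggregation is exact up to quantization error, but
# the spectrum cost grows linearly with the number of devices.
def quantize(x, bits=4):
    levels = 2 ** bits
    lo, hi = x.min(), x.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((x - lo) / step) * step

digital_avg = np.mean([quantize(u) for u in updates], axis=0)

# Analog scheme (AirComp): devices transmit simultaneously with channel
# inversion, so the multiple-access channel sums the signals directly.
# Imperfect CSI leaves a residual per-device gain error, and receiver
# noise is added once per aggregated symbol.
true_h = rng.rayleigh(scale=1.0, size=num_devices)
est_h = true_h * (1 + csi_error_std * rng.normal(size=num_devices))
gains = true_h / est_h                      # equals 1 only under perfect CSI
rx = gains @ updates + noise_std * rng.normal(size=model_dim)
analog_avg = rx / num_devices

for name, est in [("digital", digital_avg), ("analog", analog_avg)]:
    mse = np.mean((est - ideal_avg) ** 2)
    print(f"{name:7s} aggregation MSE vs. ideal average: {mse:.4e}")
```

Increasing csi_error_std degrades only the analog result, while increasing num_devices at fixed bandwidth penalizes the digital scheme (fewer bits per device), which is the qualitative behavior the abstract describes.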

Authors (6)
  1. Jiacheng Yao (16 papers)
  2. Wei Xu (536 papers)
  3. Zhaohui Yang (193 papers)
  4. Xiaohu You (177 papers)
  5. Mehdi Bennis (333 papers)
  6. H. Vincent Poor (884 papers)
Citations (2)
