Gradient Coding in Decentralized Learning for Evading Stragglers (2402.04193v3)
Abstract: In this paper, we consider a decentralized learning problem in the presence of stragglers. Although gradient coding techniques, in which devices send encoded gradients computed from redundant training data, have been developed for distributed learning to mitigate stragglers, they are difficult to apply directly to decentralized learning scenarios. To address this problem, we propose a new gossip-based decentralized learning method with gradient coding (GOCO). In the proposed method, to avoid the negative impact of stragglers, each device updates its parameter vector locally using encoded gradients based on the framework of stochastic gradient coding, and the resulting vectors are then averaged in a gossip-based manner. We analyze the convergence of GOCO for strongly convex loss functions, and we provide simulation results demonstrating that the proposed method outperforms baseline methods in learning performance.
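The abstract describes a two-phase update: a local step with encoded stochastic gradients, followed by gossip averaging over neighbors. The following is a minimal sketch of that structure, not the paper's actual algorithm: the ring topology, the toy quadratic loss, the delivery probability `p`, and the redundancy factor `r` are all illustrative assumptions. The encoding mimics stochastic gradient coding, where redundant gradient copies arrive with some probability and survivors are rescaled so the estimate stays unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, lr = 4, 3, 0.1                      # nodes, dimension, step size (assumed)

# Doubly stochastic gossip mixing matrix on a ring topology (an assumption).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def encoded_gradient(x_i, p=0.8, r=2):
    """Toy encoded stochastic gradient for f(x) = 0.5 * ||x||^2.

    Mimics stochastic gradient coding: each of r redundant copies of the
    gradient arrives independently with probability p (stragglers drop out),
    and surviving copies are scaled by 1 / (r * p) so the estimate remains
    unbiased. p and r are illustrative values, not from the paper.
    """
    true_grad = x_i                        # gradient of 0.5 * ||x||^2 is x
    arrivals = rng.random(r) < p           # which redundant copies arrive
    return true_grad * arrivals.sum() / (r * p)

x = rng.standard_normal((n, d))            # one parameter vector per node
for _ in range(100):
    g = np.stack([encoded_gradient(x[i]) for i in range(n)])
    x = W @ (x - lr * g)                   # local update, then gossip average

print(np.linalg.norm(x))                   # near zero: nodes approach the optimum
```

Even with random straggler dropouts, the rescaled encoding keeps each local gradient unbiased, and the gossip step drives the nodes toward consensus at the minimizer; this mirrors, at a toy scale, the straggler tolerance the abstract claims for GOCO.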