Compressed Bayesian Federated Learning for Reliable Passive Radio Sensing in Industrial IoT (2405.05855v1)

Published 9 May 2024 in cs.LG, cs.DC, and eess.SP

Abstract: Bayesian Federated Learning (FL) has recently been introduced to provide well-calibrated Machine Learning (ML) models that quantify the uncertainty of their predictions. Despite their advantages over frequentist FL setups, Bayesian FL tools implemented over decentralized networks incur high communication costs due to the repeated exchange of local posterior distributions among cooperating devices. This paper therefore proposes a communication-efficient decentralized Bayesian FL policy that reduces the communication overhead without sacrificing final learning accuracy or calibration. The proposed method integrates compression policies and allows devices to perform multiple optimization steps before sending their local posterior distributions. We integrate the developed tool in an Industrial Internet of Things (IIoT) use case where collaborating nodes equipped with autonomous radar sensors are tasked with reliably localizing human operators in a workplace shared with robots. Numerical results show that the developed approach obtains highly accurate yet well-calibrated ML models comparable to those provided by conventional (uncompressed) Bayesian FL tools while substantially decreasing the communication overhead (by up to 99%). Furthermore, the proposed approach outperforms state-of-the-art compressed frequentist FL setups in terms of calibration, especially when the statistical distribution of the testing dataset changes.
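
To make the recipe concrete, the sketch below illustrates the two overhead-reduction ingredients the abstract names: multiple local optimization steps between exchanges, and compression of the exchanged local posteriors. Every concrete choice here (stochastic gradient Langevin dynamics as the local posterior sampler, top-k sparsification as the compressor, a ring topology, the mixing weight, and the toy Gaussian posteriors) is an illustrative assumption, not the paper's exact algorithm or data.

```python
# Toy sketch of a communication-efficient decentralized Bayesian FL loop.
# SGLD, top-k sparsification, and the ring topology are assumptions made
# for illustration; they stand in for the paper's actual design choices.
import numpy as np

rng = np.random.default_rng(0)

def top_k(vec, k):
    """Keep the k largest-magnitude entries, zero the rest (sparsification)."""
    out = np.zeros_like(vec)
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    out[idx] = vec[idx]
    return out

def sgld_step(theta, grad_log_post, lr):
    """Stochastic gradient Langevin dynamics: a gradient step on the
    log-posterior plus Gaussian noise, so iterates behave like posterior
    samples rather than a point estimate (this is what enables calibration)."""
    noise = rng.normal(scale=np.sqrt(2.0 * lr), size=theta.shape)
    return theta + lr * grad_log_post(theta) + noise

# Each device holds a toy Gaussian local posterior with its own mean,
# standing in for heterogeneous local radar data in the IIoT use case.
n_devices, dim, k = 4, 50, 5
means = [rng.normal(size=dim) for _ in range(n_devices)]
grad_fns = [lambda th, m=m: -(th - m) for m in means]
thetas = [rng.normal(size=dim) for _ in range(n_devices)]

local_steps, rounds, lr, mix = 10, 200, 1e-2, 0.5
for _ in range(rounds):
    # 1) Several local SGLD steps before any communication: the first
    #    overhead-reduction lever named in the abstract.
    for i in range(n_devices):
        for _ in range(local_steps):
            thetas[i] = sgld_step(thetas[i], grad_fns[i], lr)

    # 2) Devices exchange only top-k-compressed states with their two ring
    #    neighbours and nudge themselves toward the received values: the
    #    second lever, compression of the exchanged posteriors.
    compressed = [top_k(th, k) for th in thetas]
    new_thetas = []
    for i in range(n_devices):
        nbrs = [compressed[(i - 1) % n_devices],
                compressed[(i + 1) % n_devices]]
        correction = sum(nb - compressed[i] for nb in nbrs) / len(nbrs)
        new_thetas.append(thetas[i] + mix * correction)
    thetas = new_thetas

avg = np.mean(thetas, axis=0)
print("per-device distance to network average:",
      [round(float(np.linalg.norm(th - avg)), 2) for th in thetas])
print(f"uplink per round: {k}/{dim} entries "
      f"({100 * (1 - k / dim):.0f}% fewer than sending the full state)")
```

In this toy configuration the compression alone cuts each exchanged message by 90%; the up-to-99% savings reported in the abstract would correspond to more aggressive compression combined with the reduced exchange frequency from multiple local steps.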
