
Robust Collaborative Inference with Vertically Split Data Over Dynamic Device Environments (2312.16638v3)

Published 27 Dec 2023 in cs.LG

Abstract: When each edge device of a network only perceives a local part of the environment, collaborative inference across multiple devices is often needed to predict global properties of the environment. In safety-critical applications, collaborative inference must be robust to significant network failures caused by environmental disruptions or extreme weather. Existing collaborative learning approaches, such as privacy-focused Vertical Federated Learning (VFL), typically assume a centralized setup or that one device never fails. However, these assumptions make prior approaches susceptible to significant network failures. To address this problem, we first formalize the problem of robust collaborative inference over a dynamic network of devices that could experience significant network faults. Then, we develop a minimalistic yet impactful method called Multiple Aggregation with Gossip Rounds and Simulated Faults (MAGS) that synthesizes simulated faults via dropout, replication, and gossiping to significantly improve robustness over baselines. We also theoretically analyze our proposed approach to explain why each component enhances robustness. Extensive empirical results validate that MAGS is robust across a range of fault rates, including extreme fault rates.
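The abstract's three ingredients (simulated faults via dropout, replication of local information, and gossip rounds among surviving devices) can be illustrated with a toy sketch. This is not the authors' implementation: the function name, the use of scalar "embeddings," the random-peer gossip schedule, and the mean aggregation are all illustrative assumptions made here for clarity.

```python
import random

def mags_inference_round(embeddings, fault_rate, n_gossip_rounds=2, seed=0):
    """Toy sketch of fault-tolerant collaborative inference.

    Each device i holds a local embedding (a scalar here for simplicity).
    Devices fail independently with probability `fault_rate`; the survivors
    replicate what they know to random peers over a few gossip rounds, and
    the final prediction averages every embedding that reached a survivor.
    All names and choices here are illustrative, not from the paper.
    """
    rng = random.Random(seed)
    n = len(embeddings)
    # Simulated faults: each device independently fails with prob fault_rate.
    alive = [i for i in range(n) if rng.random() >= fault_rate]
    # Initially, each surviving device knows only its own embedding.
    known = {i: {i: embeddings[i]} for i in alive}
    # Gossip rounds: each survivor replicates its known embeddings to a peer.
    for _ in range(n_gossip_rounds):
        for i in alive:
            if len(alive) > 1:
                peer = rng.choice([j for j in alive if j != i])
                known[peer].update(known[i])
    # Aggregate the union of embeddings any survivor holds.
    seen = {}
    for i in alive:
        seen.update(known[i])
    if not seen:
        return None  # total network failure: no device survived
    return sum(seen.values()) / len(seen)
```

With a fault rate of zero every device survives and the round reduces to a plain mean over all local embeddings; as the fault rate grows, replication and gossip let the aggregate still reflect embeddings whose originating devices have since failed.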

