
Fault Tolerant Serverless VFL Over Dynamic Device Environment (2312.16638v2)

Published 27 Dec 2023 in cs.LG

Abstract: Vertical federated learning (VFL) is a class of FL in which every client shares the same set of samples but owns only a subset of the features. VFL typically assumes perfect hardware and communication capabilities. This assumption, however, hinders the broad deployment of VFL, particularly on networks of edge devices, which are heterogeneous in their in-situ capabilities and may connect to or disconnect from the network over time. To address this gap, we study the test-time performance of VFL under dynamic network conditions, a setting we call DN-VFL. We first formalize DN-VFL, including a message-passing distributed inference algorithm, the corresponding risk, and a serverless setup. We then develop a novel DN-VFL approach, Multiple Aggregation with Gossip Rounds and Simulated Faults (MAGS), that synthesizes replication, gossiping, and selective feature omission to improve performance significantly over baselines. Furthermore, we propose metrics and extensively analyze MAGS using a simulated sensor network. The results show that naively applying standard VFL to DN-VFL is suboptimal; MAGS presents a better alternative for handling changes in the network during inference.
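The abstract combines three ideas: partial feature embeddings held by individual devices, gossip rounds that spread aggregated state across a serverless network, and faults that remove devices at inference time. The following is a minimal illustrative sketch of that combination, not the paper's actual MAGS algorithm; the function name, the dict-of-embeddings input format, and the pairwise push-pull averaging rule are all assumptions made for the example.

```python
import random

def gossip_inference(local_embeddings, p_fault=0.3, gossip_rounds=3, seed=0):
    """Hypothetical sketch: gossip-based embedding aggregation under faults.

    local_embeddings: dict mapping device id -> list[float], each device's
    partial embedding of the shared sample (its feature subset).
    Returns the aggregated embedding held by one surviving device,
    or None if every device is offline.
    """
    rng = random.Random(seed)
    # Simulated faults: each device is independently offline with prob p_fault.
    alive = [d for d in local_embeddings if rng.random() > p_fault]
    if not alive:
        return None  # no device is available to serve the prediction
    # Each surviving device starts from its own partial embedding.
    state = {d: list(local_embeddings[d]) for d in alive}
    for _ in range(gossip_rounds):
        # Push-pull gossip: every surviving device averages its state with
        # a randomly chosen surviving peer; both keep the merged result,
        # so the network-wide mean is preserved each round.
        for d in alive:
            peer = rng.choice(alive)
            merged = [(a + b) / 2 for a, b in zip(state[d], state[peer])]
            state[d] = merged
            state[peer] = list(merged)
    # Serverless property: any surviving device can now serve an
    # (approximately) aggregated embedding for downstream prediction.
    return state[alive[0]]
```

The sketch illustrates why gossip helps under dynamic conditions: because every surviving device converges toward the same aggregate, the prediction does not depend on any single coordinator staying online.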

