
Sparse Uncertainty-Informed Sampling from Federated Streaming Data (2408.17108v1)

Published 30 Aug 2024 in cs.LG and cs.CV

Abstract: We present a numerically robust, computationally efficient approach for non-I.I.D. data stream sampling in federated client systems, where resources are limited and labeled data for local model adaptation is sparse and expensive. The proposed method identifies relevant stream observations to optimize the underlying client model, given a local labeling budget, and performs instantaneous labeling decisions without relying on any memory buffering strategies. Our experiments show enhanced training batch diversity and an improved numerical robustness of the proposal compared to existing strategies over large-scale data streams, making our approach an effective and convenient solution in FL environments.
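The abstract describes selecting stream observations for labeling on the fly, under a fixed local budget and without any memory buffer. The paper's actual criterion is not given in the abstract; the following is only a generic sketch of budgeted, uncertainty-thresholded stream sampling, with all names (`stream_sample`, `predict_proba`, `threshold`) hypothetical:

```python
import numpy as np

def stream_sample(stream, predict_proba, budget, threshold=0.5):
    """Budgeted uncertainty-informed sampling over a data stream.

    Makes an instantaneous labeling decision per observation (no
    buffering): label when predictive uncertainty exceeds a threshold
    and labeling budget remains. This is a generic illustration, not
    the paper's specific criterion.
    """
    selected = []
    for x in stream:
        if budget <= 0:
            break  # labeling budget exhausted
        probs = np.asarray(predict_proba(x))
        # Margin-based uncertainty: small gap between the two most
        # probable classes means the model is uncertain about x.
        top2 = np.sort(probs)[-2:]
        uncertainty = 1.0 - (top2[1] - top2[0])
        if uncertainty >= threshold:
            selected.append(x)   # request a label for x immediately
            budget -= 1
    return selected
```

Each observation is seen exactly once and either labeled immediately or discarded, matching the buffer-free setting the abstract describes.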


