
Budgeted Online Model Selection and Fine-Tuning via Federated Learning (2401.10478v1)

Published 19 Jan 2024 in cs.LG

Abstract: Online model selection involves selecting a model from a set of candidate models 'on the fly' to perform prediction on a stream of data. The choice of candidate models therefore has a crucial impact on performance. Although employing a larger set of candidate models naturally leads to more flexibility in model selection, this may be infeasible when prediction tasks are performed on edge devices with limited memory. Faced with this challenge, the present paper proposes an online federated model selection framework in which a group of learners (clients) interacts with a server that has sufficient memory to store all candidate models. Each client stores only a subset of models that fits into its memory and performs its own prediction task using one of the stored models. Furthermore, employing the proposed algorithm, clients and the server collaborate to fine-tune models to adapt them to a non-stationary environment. Theoretical analysis proves that the proposed algorithm enjoys sub-linear regret with respect to the best model in hindsight. Experiments on real datasets demonstrate the effectiveness of the proposed algorithm.
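The client-side setting the abstract describes — picking one model per round from a memory-limited subset of candidates, then updating preferences from observed losses — can be illustrated with a minimal exponential-weights (Hedge-style) sketch. This is not the paper's algorithm; the scalar "models", the stored subset, the learning rate, and the data stream below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate "models": scalar predictors f(x) = w * x.
# In the paper's setting the server would hold full models; here each
# candidate is just a weight, for illustration only.
candidate_weights = [0.0, 0.5, 1.0, 2.0]

# A memory-limited client stores only a subset of the candidates
# (assumed indices; in the framework the subset must fit client memory).
stored = [1, 2]

eta = 0.5                              # assumed learning rate
cum_losses = np.zeros(len(stored))     # cumulative loss per stored model

def loss(pred, y):
    return (pred - y) ** 2

for t in range(200):
    x = rng.normal()
    y = 1.0 * x + 0.1 * rng.normal()   # stream generated by w = 1.0

    # Sample one stored model to predict with this round,
    # with probability decreasing in cumulative loss (Hedge weights).
    probs = np.exp(-eta * cum_losses)
    probs /= probs.sum()
    k = rng.choice(len(stored), p=probs)
    pred = candidate_weights[stored[k]] * x

    # Full-information update: charge every stored model its loss.
    for i, m in enumerate(stored):
        cum_losses[i] += loss(candidate_weights[m] * x, y)

# The stored model closest to the data-generating w = 1.0 should win.
best = stored[int(np.argmin(cum_losses))]
print(best)  # → 2
```

The sketch omits everything federated: in the paper, clients additionally exchange updates with the server to fine-tune the candidate models themselves, which is what allows adaptation to a non-stationary stream.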
