
Classifier Clustering and Feature Alignment for Federated Learning under Distributed Concept Drift (2410.18478v1)

Published 24 Oct 2024 in cs.LG

Abstract: Data heterogeneity is one of the key challenges in federated learning, and many efforts have been devoted to tackling this problem. However, distributed concept drift with data heterogeneity, where clients may additionally experience different concept drifts, is a largely unexplored area. In this work, we focus on real drift, where the conditional distribution $P(Y|X)$ changes. We first study how distributed concept drift affects model training and find that the local classifier plays a critical role in drift adaptation. Moreover, to address data heterogeneity, we study feature alignment under distributed concept drift and identify two factors that are crucial for it: the conditional distribution $P(Y|X)$ and the degree of data heterogeneity. Motivated by these findings, we propose FedCCFA, a federated learning framework with classifier clustering and feature alignment. To enhance collaboration under distributed concept drift, FedCCFA clusters local classifiers at the class level and generates clustered feature anchors according to the clustering results. Assisted by these anchors, FedCCFA adaptively aligns clients' feature spaces based on the entropy of the label distribution $P(Y)$, alleviating inconsistency in the feature space. Our results demonstrate that FedCCFA significantly outperforms existing methods under various concept drift settings. Code is available at https://github.com/Chen-Junbao/FedCCFA.
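The entropy-based adaptive alignment described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); the function names, the normalization by maximum entropy, and the squared-distance penalty to per-class anchors are all illustrative assumptions about how alignment strength might scale with the entropy of a client's label distribution $P(Y)$.

```python
import math
from collections import Counter

def label_entropy(labels):
    """Shannon entropy of the empirical label distribution P(Y)."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def alignment_weight(labels, num_classes):
    """Illustrative: scale alignment strength by normalized label entropy.

    A balanced local label distribution (high entropy) yields weight ~1.0,
    so alignment to shared anchors is strong; a skewed distribution
    (low entropy) weakens the alignment penalty.
    """
    max_entropy = math.log(num_classes)
    return label_entropy(labels) / max_entropy if max_entropy > 0 else 0.0

def alignment_loss(features, anchors, labels, num_classes):
    """Illustrative entropy-weighted alignment term: mean squared distance
    between each sample's feature vector and the anchor of its class."""
    w = alignment_weight(labels, num_classes)
    total = 0.0
    for f, y in zip(features, labels):
        total += sum((fi - ai) ** 2 for fi, ai in zip(f, anchors[y]))
    return w * total / len(features)
```

Under this sketch, a client whose local data covers all classes evenly pulls its features firmly toward the clustered anchors, while a client with highly skewed labels is penalized less, since its empirical $P(Y)$ is a poor estimate of the global one.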

