
FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler (2405.15458v2)

Published 24 May 2024 in cs.LG and cs.DC

Abstract: Federated learning (FL) enables collaborative machine learning across distributed data owners, but data heterogeneity poses a challenge for model calibration. While prior work has focused on improving accuracy under non-IID data, calibration remains under-explored. This study reveals that existing FL aggregation approaches lead to sub-optimal calibration, and our theoretical analysis shows that, even when the variance of clients' label distributions is constrained, the global calibration error is still asymptotically lower-bounded. To address this, we propose Federated Calibration (FedCal), a novel approach emphasizing both local and global calibration. It leverages client-specific scalers for local calibration, effectively correcting output misalignment without sacrificing prediction accuracy. These scalers are then aggregated via weight averaging to generate a global scaler that minimizes the global calibration error. Extensive experiments demonstrate that FedCal significantly outperforms the best-performing baseline, reducing global calibration error by 47.66% on average.
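The abstract's pipeline — per-client parameterized scalers, weight-averaged into a global scaler, evaluated by calibration error — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `AffineScaler` form (elementwise `W*z + b` on logits) and the names `aggregate_scalers` / `expected_calibration_error` are assumptions for demonstration; the paper's scaler parameterization and error metric may differ in detail.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with max-subtraction for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class AffineScaler:
    """Hypothetical per-client parameterized scaler: maps logits z to
    softmax(W * z + b). Each client would fit W, b on its own held-out data."""
    def __init__(self, num_classes):
        self.W = np.ones(num_classes)
        self.b = np.zeros(num_classes)

    def __call__(self, logits):
        return softmax(logits * self.W + self.b)

def aggregate_scalers(scalers, weights):
    """Build a global scaler by weighted averaging of client scaler
    parameters (FedAvg-style), as the abstract describes."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    g = AffineScaler(len(scalers[0].W))
    g.W = sum(wi * s.W for wi, s in zip(w, scalers))
    g.b = sum(wi * s.b for wi, s in zip(w, scalers))
    return g

def expected_calibration_error(probs, labels, n_bins=10):
    """Standard ECE: bin predictions by confidence, then average the
    |accuracy - mean confidence| gap weighted by bin mass."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        hi = lo + 1.0 / n_bins
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            ece += mask.mean() * abs(acc - conf[mask].mean())
    return ece
```

In this sketch, local calibration comes from each client's own `AffineScaler`, while the averaged parameters give one global scaler that can be applied to the global model's logits; ECE (or a similar calibration error) would then be measured on pooled data to assess global calibration.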
