Bayesian Neural Network For Personalized Federated Learning Parameter Selection (2402.16091v1)
Abstract: Federated learning's poor performance in the presence of heterogeneous data remains one of the most pressing issues in the field. Personalized federated learning departs from the conventional paradigm, in which all clients employ the same model, and instead strives to discover an individualized model for each client to address the heterogeneity in the data. One such approach personalizes specific layers of the neural network. However, prior efforts have not provided a dependable rationale for this choice, and different works have selected entirely different, even conflicting, sets of personalized layers. In this work, we take a step further by proposing personalization at the element level rather than the traditional layer level. To select personalized parameters, we introduce Bayesian neural networks and rely on the uncertainty they provide to guide the selection. Finally, we validate our algorithm's efficacy on several real-world datasets, demonstrating that the proposed approach outperforms existing baselines.
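The core idea above — using per-parameter uncertainty to decide which weights stay local and which are aggregated globally — can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a mean-field variational posterior where each weight has a learned standard deviation, and the function names (`select_personalized_mask`, `merge_global_local`) and the top-fraction selection rule are hypothetical.

```python
import numpy as np

def select_personalized_mask(posterior_std, ratio):
    """Mark the fraction `ratio` of parameters with the highest posterior
    standard deviation (i.e., the most client-specific uncertainty) as
    personalized (True). Ties at the threshold may all be selected."""
    flat = posterior_std.ravel()
    k = max(1, int(round(ratio * flat.size)))
    threshold = np.partition(flat, -k)[-k]  # k-th largest std value
    return posterior_std >= threshold

def merge_global_local(global_params, local_params, mask):
    """Keep personalized entries from the local model; take the remaining
    (shared) entries from the server-aggregated global model."""
    return np.where(mask, local_params, global_params)

# Example: personalize the 50% most uncertain parameters of one tensor.
std = np.array([[0.1, 0.9],
                [0.5, 0.05]])
mask = select_personalized_mask(std, 0.5)
merged = merge_global_local(np.zeros((2, 2)), np.ones((2, 2)), mask)
```

In a federated round, each client would apply such a mask after receiving the global update, so only low-uncertainty (consensus) parameters are overwritten by aggregation while high-uncertainty parameters remain personalized.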