Cross-client Label Propagation for Transductive and Semi-Supervised Federated Learning

Published 12 Oct 2022 in cs.LG (arXiv:2210.06434v4)

Abstract: We present Cross-Client Label Propagation (XCLP), a new method for transductive federated learning. XCLP estimates a data graph jointly from the data of multiple clients and computes labels for the unlabeled data by propagating label information across the graph. To avoid clients having to share their data with anyone, XCLP employs two cryptographically secure protocols: secure Hamming distance computation and secure summation. We demonstrate two distinct applications of XCLP within federated learning. In the first, we use it in a one-shot way to predict labels for unseen test points. In the second, we use it to repeatedly pseudo-label unlabeled training data in a federated semi-supervised setting. Experiments on both real federated and standard benchmark datasets show that in both applications XCLP achieves higher classification accuracy than alternative approaches.
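To illustrate the graph-based label-propagation step at the core of methods like XCLP, here is a minimal NumPy sketch in the classic local-and-global-consistency style (Zhou et al., 2004). It omits the paper's federated setting and cryptographic protocols entirely, and all names and parameters (`alpha`, `k`, the Gaussian affinity) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def label_propagation(features, labels, alpha=0.99, k=10):
    """Propagate labels over a k-NN similarity graph.

    features: (n, d) array of feature vectors.
    labels:   length-n int array, with -1 marking unlabeled points.
    Returns predicted class labels for all n points.
    """
    n = len(features)
    # Pairwise Euclidean distances and a Gaussian affinity matrix.
    dists = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
    sigma = np.median(dists) + 1e-12
    W = np.exp(-dists**2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    # Sparsify: keep only each point's k most similar neighbours, then symmetrize.
    drop = np.argsort(-W, axis=1)[:, k:]
    for i in range(n):
        W[i, drop[i]] = 0.0
    W = np.maximum(W, W.T)
    # Symmetric normalization: S = D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1) + 1e-12)
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # One-hot label matrix Y; unlabeled points get all-zero rows.
    classes = np.unique(labels[labels >= 0])
    Y = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        Y[labels == c, j] = 1.0
    # Closed-form propagation: F = (I - alpha * S)^{-1} Y.
    F = np.linalg.solve(np.eye(n) - alpha * S, Y)
    return classes[F.argmax(axis=1)]
```

In XCLP the analogous graph is built jointly across clients, with secure Hamming distance computation standing in for the plaintext distance matrix above and secure summation aggregating the propagated label mass without revealing any client's data.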

