Multi-Participant Multi-Class Vertical Federated Learning (2001.11154v1)

Published 30 Jan 2020 in cs.LG and stat.ML

Abstract: Federated learning (FL) is a privacy-preserving paradigm for training collective machine learning models with locally stored data from multiple participants. Vertical federated learning (VFL) deals with the case where participants share the same sample ID space but have different feature spaces, while label information is owned by one participant. Current studies of VFL only support two participants and mostly focus on binary-class logistic regression problems. In this paper, we propose the Multi-participant Multi-class Vertical Federated Learning (MMVFL) framework for multi-class VFL problems involving multiple parties. Extending the idea of multi-view learning (MVL), MMVFL enables label sharing from its owner to other VFL participants in a privacy-preserving manner. To demonstrate the effectiveness of MMVFL, a feature selection scheme is incorporated into MMVFL to compare its performance against supervised feature selection and MVL-based approaches. Experiment results on real-world datasets show that MMVFL can effectively share label information among multiple VFL participants and match the multi-class classification performance of existing approaches.
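
To make the vertical data partitioning described in the abstract concrete, the sketch below is a hypothetical illustration (not code from the paper): it splits one dataset's feature columns across several participants that share the same sample ID space, with only one participant holding the multi-class labels. All names and the even column split are assumptions for illustration only.

```python
import numpy as np

# Hypothetical sketch of a vertical federated learning (VFL) setup:
# every participant sees the same sample IDs but holds a disjoint block
# of feature columns; only one participant owns the label vector.

rng = np.random.default_rng(0)
n_samples, n_features, n_participants = 1000, 30, 3

X = rng.normal(size=(n_samples, n_features))   # full (virtual) feature matrix
y = rng.integers(0, 5, size=n_samples)         # multi-class labels (5 classes)

# Assumed partitioning: split feature columns evenly among participants.
feature_splits = np.array_split(np.arange(n_features), n_participants)

participants = []
for i, cols in enumerate(feature_splits):
    participants.append({
        "id": i,
        "sample_ids": np.arange(n_samples),    # shared sample ID space
        "features": X[:, cols],                # locally stored feature block
        "labels": y if i == 0 else None,       # only participant 0 owns labels
    })

for p in participants:
    print(f"participant {p['id']}: features {p['features'].shape}, "
          f"has_labels={p['labels'] is not None}")
```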

Authors (2)
  1. Siwei Feng (13 papers)
  2. Han Yu (218 papers)
Citations (82)