
Vertical Federated Learning: Concepts, Advances and Challenges (2211.12814v4)

Published 23 Nov 2022 in cs.LG, cs.AI, cs.CR, and cs.DC

Abstract: Vertical Federated Learning (VFL) is a federated learning setting in which multiple parties, each holding different features for the same set of users, jointly train machine learning models without exposing their raw data or model parameters. Motivated by the rapid growth in VFL research and real-world applications, we provide a comprehensive review of the concept and algorithms of VFL, as well as current advances and challenges in various aspects, including effectiveness, efficiency, and privacy. We provide an exhaustive categorization of VFL settings and privacy-preserving protocols, and comprehensively analyze the privacy attacks and defense strategies for each protocol. We then propose a unified framework, termed VFLow, which considers the VFL problem under communication, computation, privacy, effectiveness, and fairness constraints. Finally, we review the most recent advances in industrial applications, highlighting open challenges and future directions for VFL.
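To make the setting concrete, below is a minimal toy sketch of the core VFL idea described in the abstract: two parties hold different feature columns for the same users and jointly fit a logistic model by exchanging only intermediate outputs and gradients, never raw features. All names and the plain (unencrypted) gradient exchange are illustrative assumptions; real VFL protocols surveyed in the paper add privacy-preserving machinery such as homomorphic encryption or secret sharing.

```python
import numpy as np

# Toy vertical federated learning sketch: party A and party B hold
# different features for the SAME aligned set of users; labels sit
# with party B. Names (Xa, Xb, wa, wb) are hypothetical.
rng = np.random.default_rng(0)
n = 200
Xa = rng.normal(size=(n, 3))   # party A's private feature columns
Xb = rng.normal(size=(n, 2))   # party B's private feature columns
y = (Xa[:, 0] + Xb[:, 0] > 0).astype(float)  # synthetic labels at B

wa = np.zeros(3)  # party A's local model parameters
wb = np.zeros(2)  # party B's local model parameters
lr = 0.1

for _ in range(300):
    # Each party computes a partial logit on its own features only;
    # raw features never leave the party, only these intermediates.
    za = Xa @ wa
    zb = Xb @ wb
    p = 1.0 / (1.0 + np.exp(-(za + zb)))  # aggregated prediction
    g = p - y  # gradient of the logistic loss w.r.t. the joint logit
    # The label holder returns the logit gradient; each party then
    # updates its own parameters locally.
    wa -= lr * (Xa.T @ g) / n
    wb -= lr * (Xb.T @ g) / n

acc = float(np.mean((p > 0.5) == (y > 0.5)))
```

In a production protocol the exchanged quantities `za`, `zb`, and `g` would be protected (e.g., encrypted or perturbed), since this plain exchange is exactly where the privacy attacks analyzed in the survey apply.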

Authors (9)
  1. Yang Liu (2253 papers)
  2. Yan Kang (49 papers)
  3. Tianyuan Zou (8 papers)
  4. Yanhong Pu (1 paper)
  5. Yuanqin He (9 papers)
  6. Xiaozhou Ye (18 papers)
  7. Ye Ouyang (16 papers)
  8. Ya-Qin Zhang (45 papers)
  9. Qiang Yang (202 papers)
Citations (99)
