
EFMVFL: An Efficient and Flexible Multi-party Vertical Federated Learning without a Third Party (2201.06244v1)

Published 17 Jan 2022 in cs.LG and cs.CR

Abstract: Federated learning allows multiple participants to conduct joint modeling without disclosing their local data. Vertical federated learning (VFL) handles the situation where participants share the same ID space but different feature spaces. In most VFL frameworks, to protect the security and privacy of the participants' local data, a third party is needed to generate homomorphic encryption key pairs and perform decryption operations. In this way, the third party is granted the right to decrypt information related to model parameters. However, it is not easy to find such a credible entity in the real world. Existing methods for solving this problem are either communication-intensive or unsuitable for multi-party scenarios. By combining secret sharing and homomorphic encryption, we propose a novel VFL framework without a third party called EFMVFL, which supports flexible expansion to multiple participants with low communication overhead and is applicable to generalized linear models. We give instantiations of our framework under logistic regression and Poisson regression. Theoretical analysis and experiments show that our framework is secure, more efficient, and easily extended to multiple participants.
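The abstract names additive secret sharing as one of the two building blocks EFMVFL combines to avoid a decrypting third party. The sketch below is not taken from the paper; it is a minimal, generic illustration of additive secret sharing over a prime field, showing why no single party's share reveals a value and why parties can add shared values locally. The modulus and function names are illustrative choices, not the paper's.

```python
import secrets

P = 2**61 - 1  # illustrative prime modulus, not specified by the paper

def share(secret: int, n_parties: int) -> list[int]:
    """Split `secret` into n additive shares that sum to it mod P.

    Any subset of fewer than n shares is uniformly random and
    reveals nothing about the secret.
    """
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares mod P."""
    return sum(shares) % P

# Each party holds one share of each value.
s1 = share(42, 3)
s2 = share(100, 3)

# Shares of a sum can be computed locally, with no communication:
summed = [(a + b) % P for a, b in zip(s1, s2)]

assert reconstruct(s1) == 42
assert reconstruct(summed) == 142
```

In a VFL setting along these lines, intermediate quantities (e.g. partial predictions of a generalized linear model) would be secret-shared across participants, so that aggregation happens on shares and no single party, and no third party, ever sees another party's plaintext contribution.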

Authors (6)
  1. Yimin Huang (17 papers)
  2. Xinyu Feng (7 papers)
  3. Wanwan Wang (6 papers)
  4. Hao He (99 papers)
  5. Yukun Wang (21 papers)
  6. Ming Yao (5 papers)
Citations (5)
