
Resilient Federated Learning under Byzantine Attack in Distributed Nonconvex Optimization with 2f-Redundancy (2312.10189v1)

Published 15 Dec 2023 in math.OC

Abstract: We study the problem of Byzantine fault tolerance in a distributed optimization setting, where a group of $N$ agents communicates with a trusted centralized coordinator. Among these agents, a subset of $f$ agents may not follow the prescribed algorithm and may share arbitrarily incorrect information with the coordinator. The goal is to find the optimizer of the aggregate cost function of the honest agents. We study the local gradient descent method, also known as federated learning, for solving this problem; however, in the Byzantine setting this method often returns only an approximate value of the underlying optimal solution. Recent work showed that by incorporating the so-called comparative elimination (CE) filter at the coordinator, one can provably mitigate the detrimental impact of Byzantine agents and exactly compute the true optimizer in the convex setting. The focus of the present work is to establish the convergence of local gradient methods with the CE filter in the nonconvex setting. We also provide numerical simulations that support our theoretical results.
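
To make the setup concrete, below is a minimal Python sketch of one possible instantiation of local gradient descent with a comparative-elimination step at the coordinator. Everything here is an assumption for illustration: the quadratic local costs, the attack model, and the specific CE rule (sort received messages by distance to the server's current iterate and drop the $f$ farthest) stand in for the paper's actual algorithm and analysis, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
N, f, d = 10, 2, 5          # agents, Byzantine bound, dimension
T, K, eta = 200, 5, 0.05    # rounds, local steps, step size

# Illustrative local costs (assumption): honest agent i minimizes
# (1/2) * ||x - a_i||^2, so the honest optimum is the mean of the a_i.
targets = rng.normal(size=(N, d))

def ce_filter(x_server, msgs, f):
    """Comparative elimination (one common form, assumed here):
    drop the f messages farthest from the server's current iterate,
    then average the remaining N - f."""
    dists = np.linalg.norm(msgs - x_server, axis=1)
    keep = np.argsort(dists)[: len(msgs) - f]   # indices of N - f closest
    return msgs[keep].mean(axis=0)

x = np.zeros(d)
for t in range(T):
    msgs = []
    for i in range(N):
        if i < f:
            # Assumed attack model: Byzantine agents send arbitrary vectors.
            msgs.append(rng.normal(scale=10.0, size=d))
        else:
            # Honest agents: K local gradient descent steps from x.
            y = x.copy()
            for _ in range(K):
                y -= eta * (y - targets[i])
            msgs.append(y)
    x = ce_filter(x, np.array(msgs), f)

# Distance from the filtered iterate to the honest agents' optimizer.
print(np.linalg.norm(x - targets[f:].mean(axis=0)))
```

One reason a rule of this shape can work: after the filter keeps $N - f$ messages, at least $N - 2f$ of them necessarily come from honest agents, which is plausibly where the 2f-redundancy condition in the title enters the analysis.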

Citations (1)
