
On the Convergence of Momentum-Based Algorithms for Federated Bilevel Optimization Problems (2204.13299v2)

Published 28 Apr 2022 in cs.LG

Abstract: In this paper, we study the federated bilevel optimization problem, which has widespread applications in machine learning. In particular, we develop two momentum-based algorithms for optimizing this class of problems and establish their convergence rates, providing the sample and communication complexities. Importantly, to the best of our knowledge, our convergence rate is the first to achieve linear speedup with respect to the number of devices among federated bilevel optimization algorithms. Finally, extensive experimental results confirm the effectiveness of our two algorithms.
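To illustrate the general setting, here is a minimal single-device sketch of a momentum-based bilevel optimization step on a toy quadratic problem. This is not the paper's algorithm: the matrix `A`, the step sizes, and the closed-form inner problem are all illustrative assumptions; the federated variants studied in the paper would additionally average such updates across devices.

```python
import numpy as np

# Toy bilevel problem (illustrative assumption, not the paper's setup):
#   inner:  y*(x) = argmin_y 0.5 * ||y - A @ x||^2   =>   y*(x) = A @ x
#   outer:  F(x)  = 0.5 * ||y*(x)||^2, so grad F(x) = A.T @ (A @ x)
A = np.array([[1.0, 0.2, 0.0],
              [0.0, 1.0, 0.3],
              [0.1, 0.0, 1.0]])

def hypergradient(x, y):
    # With the closed-form inner solution above, dF/dx = A.T @ y
    return A.T @ y

x = np.ones(3)                 # outer variable
m = np.zeros_like(x)           # momentum buffer
eta, beta, inner_steps = 0.1, 0.9, 10

for _ in range(300):
    # Approximate the inner solution with a few gradient steps
    y = np.zeros_like(x)
    for _ in range(inner_steps):
        y -= 0.5 * (y - A @ x)
    # Momentum (exponential moving average) update on the outer variable
    m = beta * m + (1 - beta) * hypergradient(x, y)
    x = x - eta * m

print(np.linalg.norm(x))  # the outer optimum here is x = 0
```

The momentum buffer `m` smooths the stochastic or approximate hypergradients; in the federated setting each device maintains such a buffer locally and the server periodically aggregates the outer iterates, which is what drives the linear-speedup result claimed in the abstract.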

Citations (1)
