
Accelerated Distributed Projected Gradient Descent for Convex Optimization with Clique-wise Coupled Constraints (2211.06284v2)

Published 11 Nov 2022 in math.OC, cs.SY, and eess.SY

Abstract: This paper addresses a distributed convex optimization problem with a class of coupled constraints that arise in multi-agent systems composed of multiple communities modeled as cliques. First, we propose a fully distributed gradient-based algorithm built on a novel operator inspired by convex projection, called the clique-based projection. Next, we analyze its convergence properties for both diminishing and fixed step sizes. For diminishing step sizes, we show convergence to an optimal solution under the assumptions that the objective function is smooth and the constraint set is compact. Additionally, when the objective function is strongly convex, convergence to the unique solution is proved without the compactness assumption. For fixed step sizes, we prove a non-ergodic convergence rate of O(1/k) for the objective residual under the smoothness assumption. Furthermore, we apply Nesterov's acceleration to the proposed algorithm and establish a convergence rate of O(1/k^2). Numerical experiments illustrate the effectiveness of the proposed method.
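The paper's clique-based projection operator is not reproduced here, but the accelerated projected gradient template it builds on can be sketched as follows. The quadratic objective, box constraint, and step size below are illustrative assumptions standing in for the paper's setup, and the box projection is a placeholder for the clique-based projection:

```python
import numpy as np

def project_box(x, lo=-1.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^n; a simple stand-in
    # for the paper's clique-based projection operator.
    return np.clip(x, lo, hi)

def accelerated_projected_gradient(grad, x0, L, num_iters=100):
    # Nesterov-accelerated projected gradient with fixed step size 1/L,
    # the scheme achieving the O(1/k^2) objective-residual rate.
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(num_iters):
        x_next = project_box(y - grad(y) / L)                  # gradient step + projection
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0      # momentum coefficient update
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)       # extrapolation step
        x, t = x_next, t_next
    return x

# Illustrative smooth convex objective: f(x) = 0.5 * ||x - c||^2, L = 1.
c = np.array([2.0, -3.0, 0.5])
grad = lambda x: x - c
x_star = accelerated_projected_gradient(grad, np.zeros(3), L=1.0)
# The constrained minimizer is the projection of c onto the box: [1, -1, 0.5].
```

In the distributed setting of the paper, each agent would carry out such updates locally, with the projection computed cooperatively within each clique rather than globally as above.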

Citations (2)
