Exponential Convergence for Distributed Smooth Optimization Under the Restricted Secant Inequality Condition (1909.03282v1)

Published 7 Sep 2019 in math.OC

Abstract: This paper considers the distributed smooth optimization problem of minimizing a global cost function, formed as the sum of local smooth cost functions, using only local information exchange. The standard assumption for proving exponential/linear convergence of first-order methods is strong convexity of the cost functions, which does not hold in many practical applications. In this paper, we first show that the continuous-time distributed primal-dual gradient algorithm converges exponentially to a global minimizer under the assumption that the global cost function satisfies the restricted secant inequality condition. This condition is weaker than strong convexity since it requires neither convexity nor uniqueness of the global minimizer. We then show that the discrete-time distributed primal-dual algorithm, constructed via Euler's approximation method, converges linearly to a global minimizer under the same condition. The theoretical results are illustrated by numerical simulations.
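
For context, the restricted secant inequality (RSI) is commonly stated as below. The constant name mu and the projection notation follow the usual convention in the literature and are not necessarily the paper's own notation.

```latex
% Restricted secant inequality (RSI) with constant \mu > 0:
% X^* is the set of global minimizers of f, and \mathcal{P}_{X^*}(x) is the
% Euclidean projection of x onto X^*.
\[
  \nabla f(x)^{\top}\bigl(x - \mathcal{P}_{X^*}(x)\bigr)
  \;\ge\; \mu\,\bigl\|x - \mathcal{P}_{X^*}(x)\bigr\|^{2}
  \quad \text{for all } x .
\]
```

Strong convexity implies RSI, but RSI also holds for some nonconvex functions and allows a non-singleton solution set, which is why it is strictly weaker.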
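
To make the algorithmic setup concrete, here is a minimal numerical sketch of an Euler-discretized distributed primal-dual gradient update over a line graph. The quadratic local costs, the gains eta and alpha, the graph, and the specific update form are illustrative assumptions in the spirit of standard distributed primal-dual dynamics; this is not claimed to be the paper's exact algorithm or tuning.

```python
import numpy as np

n, d = 5, 2                      # number of agents, decision dimension
rng = np.random.default_rng(0)

# Illustrative local costs f_i(x) = 0.5 * ||A_i x - b_i||^2 (assumed, not from the paper).
A = rng.standard_normal((n, d, d))
b = rng.standard_normal((n, d))
grad = lambda i, x: A[i].T @ (A[i] @ x - b[i])

# Laplacian of an undirected line graph 0-1-2-3-4.
L = np.zeros((n, n))
for i in range(n - 1):
    L[i, i] += 1; L[i + 1, i + 1] += 1
    L[i, i + 1] -= 1; L[i + 1, i] -= 1

eta, alpha = 0.01, 2.0           # step size and consensus gain (assumed)
x = rng.standard_normal((n, d))  # primal variables, one local copy per agent
v = np.zeros((n, d))             # dual variables enforcing consensus

for _ in range(5000):
    Lx, Lv = L @ x, L @ v
    g = np.array([grad(i, x[i]) for i in range(n)])
    # Euler step of the primal-dual gradient dynamics:
    #   x_dot = -grad f(x) - alpha * L x - L v,   v_dot = L x
    x = x - eta * (g + alpha * Lx + Lv)
    v = v + eta * Lx

print("consensus error:", np.linalg.norm(L @ x))
```

Each agent updates using only its own gradient and its neighbors' states (the Laplacian products), which is the "local information exchange" structure the abstract refers to.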
