Computational Convergence Analysis of Distributed Gradient Tracking for Smooth Convex Optimization Using Dissipativity Theory (1810.00257v2)

Published 29 Sep 2018 in math.OC and cs.SY

Abstract: We present a computational analysis that establishes the $O(1/K)$ convergence of the distributed gradient tracking method when the objective function is smooth and convex but not strongly convex. The analysis is inspired by recent work applying dissipativity theory to centralized optimization algorithms, in which convergence is proved by searching for a numerical certificate consisting of a storage function and a supply rate. We derive a base supply rate that can be used to analyze distributed optimization with non-strongly convex objective functions. The base supply rate is then combined with integral quadratic constraints to create a richer class of supply rates. Provided that this class of supply rates is rich enough, a numerical certificate of convergence can be generated automatically by a standard procedure that involves solving a linear matrix inequality. Our computational analysis certifies convergence under a broader range of step sizes than the original analytic result.
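
The distributed gradient tracking method analyzed in the paper augments consensus on the iterates with a second recursion that tracks the network-average gradient. Below is a minimal runnable sketch of the standard update (in the style of Qu and Li); the quadratic local objectives, the ring-graph Metropolis weights, the step size, and the iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of distributed gradient tracking (illustrative; not the
# paper's code). Each of n agents holds a local least-squares objective
# f_i(x) = 0.5 * ||A_i x - b_i||^2; the network minimizes their sum.
rng = np.random.default_rng(0)
n, d, K = 5, 3, 2000                      # agents, dimension, iterations (assumed)
A = [rng.standard_normal((4, d)) for _ in range(n)]
b = [rng.standard_normal(4) for _ in range(n)]

def grad(i, x):
    """Gradient of the local objective f_i at x."""
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix W for a ring graph (Metropolis weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
    W[i, i] = 1.0 / 3.0

alpha = 1e-2                              # step size (the paper studies its admissible range)
x = np.zeros((n, d))                      # row i is agent i's iterate
y = np.array([grad(i, x[i]) for i in range(n)])   # trackers, y_i^0 = grad f_i(x_i^0)

for _ in range(K):
    x_new = W @ x - alpha * y             # consensus step plus tracked-gradient descent
    # Tracker update: consensus on y plus the local gradient increment;
    # this preserves mean(y) = average of the current local gradients.
    y = W @ y + np.array([grad(i, x_new[i]) - grad(i, x[i]) for i in range(n)])
    x = x_new

# Agents should reach consensus on a minimizer of sum_i f_i.
x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
print("consensus spread:", np.max(np.abs(x - x.mean(axis=0))))
print("distance to optimum:", np.linalg.norm(x.mean(axis=0) - x_star))
```

The paper's $O(1/K)$ guarantee applies to iterations of this form when each local objective is smooth and convex but not necessarily strongly convex.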
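
The certificate search the abstract describes (fix a supply rate, then solve a linear matrix inequality for a storage function) is easiest to see in the centralized, strongly convex setting of the dissipativity/IQC literature the authors build on. The sketch below, using cvxpy with its bundled SDP solver, certifies a linear rate for plain gradient descent on an $m$-strongly convex, $L$-smooth function via the standard sector constraint. It is a deliberately simplified stand-in, not the paper's distributed, merely-convex construction, and all constants are assumed.

```python
import cvxpy as cp
import numpy as np

# Hedged illustration of an LMI-based convergence certificate (centralized,
# strongly convex case). The paper's actual construction handles distributed
# gradient tracking without strong convexity; this is a simplified stand-in.
m, L = 1.0, 10.0                 # assumed strong-convexity and smoothness constants
alpha = 2.0 / (m + L)            # classic gradient-descent step size
rho = 0.85                       # rate to certify; slightly above the tight
                                 # value (L - m) / (L + m) ~ 0.818 for slack

# Gradient descent as a one-state linear system: xi_{k+1} = A*xi_k + B*u_k,
# where u_k = grad f(x_k) enters as a feedback signal.
A, B = 1.0, -alpha

# Sector constraint on u = grad f(x) for m-strongly convex, L-smooth f:
# [x - x*; u]^T M [x - x*; u] >= 0 at every iterate.
M = np.array([[-m * L, (m + L) / 2],
              [(m + L) / 2, -1.0]])

P = cp.Variable(nonneg=True)     # storage function V(x) = P * (x - x*)^2
lam = cp.Variable(nonneg=True)   # multiplier weighting the supply rate

# Dissipation inequality as an LMI: V shrinks by rho^2 per step whenever
# the sector constraint holds.
lhs = cp.bmat([[A * P * A - rho**2 * P, A * P * B],
               [B * P * A, B * P * B]]) + lam * M
lhs = (lhs + lhs.T) / 2          # enforce structural symmetry for the solver
prob = cp.Problem(cp.Minimize(0), [lhs << 0, P >= 1.0])  # P >= 1 rules out
prob.solve()                                             # the trivial P = 0
print("status:", prob.status, "| P =", P.value, "| lambda =", lam.value)
```

A status of `optimal` means the solver found a storage function and multiplier certifying the rate; the paper automates the analogous search for distributed gradient tracking, with a richer class of supply rates built from the base supply rate and integral quadratic constraints.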
