
Convergence Rate of Accelerated Average Consensus with Local Node Memory: Optimization and Analytic Solutions (2110.09678v2)

Published 19 Oct 2021 in math.OC, cs.SY, and eess.SY

Abstract: Previous research has shown that adding local memory can accelerate consensus. It is natural to ask what the fastest rate achievable by $M$-tap memory acceleration is, and what the corresponding control parameters are. This paper introduces a set of effective and previously unused techniques to analyze the convergence rate of accelerated consensus with $M$-tap memory at local nodes and to design the control protocols. These techniques, including the Kharitonov stability theorem, the Routh stability criterion and the robust stability margin, lead to the following new results: 1) a direct link between the convergence rate and the control parameters; 2) explicit formulas for the optimal convergence rate and the corresponding optimal control parameters for $M \leq 2$ on a given graph; 3) the optimal worst-case convergence rate and the corresponding optimal control parameters for memory $M \geq 1$ on a set of uncertain graphs. We show that acceleration with memory $M = 1$ provides the optimal convergence rate in the sense of worst-case performance. Several numerical examples are given to demonstrate the validity and performance of the theoretical results.
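To make the idea of memory-accelerated consensus concrete, here is a minimal numerical sketch. It compares plain averaging $x(k+1) = Wx(k)$ against a one-tap-memory (heavy-ball-style) recursion $x(k+1) = \beta W x(k) + (1-\beta)\,x(k-1)$ on a small path graph. The graph size, weight matrix, and the classical tuning $\beta = 2/(1+\sqrt{1-\rho^2})$ (with $\rho$ the spectral radius of $W$ on the disagreement subspace) are illustrative assumptions, not the paper's protocol or its optimal parameters.

```python
import numpy as np

# Illustrative setup: path graph on n = 5 nodes (an assumption, not from the paper).
n = 5
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A      # graph Laplacian
W = np.eye(n) - L / 3.0             # doubly stochastic consensus weights, step 1/3

rng = np.random.default_rng(0)
x0 = rng.standard_normal(n)         # initial node values
avg = np.full(n, x0.mean())         # consensus target: the average

# Spectral radius of W on the disagreement subspace (second-largest |eigenvalue|).
eig = np.sort(np.abs(np.linalg.eigvalsh(W)))
rho = eig[-2]
# Classical heavy-ball/Chebyshev tuning; illustrative, not the paper's optimum.
beta = 2.0 / (1.0 + np.sqrt(1.0 - rho**2))

K = 40

# Plain consensus: x(k+1) = W x(k).
x_plain = x0.copy()
for _ in range(K):
    x_plain = W @ x_plain

# One-tap memory (M = 1): x(k+1) = beta * W x(k) + (1 - beta) * x(k-1).
x_prev, x_cur = x0.copy(), W @ x0   # first step is a plain update
for _ in range(K - 1):
    x_cur, x_prev = beta * (W @ x_cur) + (1.0 - beta) * x_prev, x_cur

err_plain = np.linalg.norm(x_plain - avg)
err_acc = np.linalg.norm(x_cur - avg)
print(f"plain error: {err_plain:.2e}, accelerated error: {err_acc:.2e}")
```

With these parameters, the memory term shrinks the per-step contraction factor from roughly $\rho$ to roughly $\sqrt{\beta - 1}$, so the accelerated iterate reaches the average orders of magnitude faster after the same number of steps.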

Citations (13)
