
Input-Feedforward-Passivity-Based Distributed Optimization Over Jointly Connected Balanced Digraphs (1905.03468v3)

Published 9 May 2019 in math.OC and cs.MA

Abstract: In this paper, a distributed optimization problem is investigated via input feedforward passivity. First, an input-feedforward-passivity-based continuous-time distributed algorithm is proposed. It is shown that the error system of the proposed algorithm can be decomposed into a group of individual input feedforward passive (IFP) systems that interact with each other through output feedback. Based on this IFP framework, conditions on the coupling gain that guarantee convergence are derived over weight-balanced and uniformly jointly strongly connected (UJSC) topologies. It is also shown that the IFP-based algorithm converges exponentially when the topology is strongly connected. Second, a novel distributed derivative feedback algorithm is proposed based on the passivation of IFP systems. While most works on directed topologies require knowledge of the eigenvalues of the graph Laplacian, the derivative feedback algorithm is fully distributed: it is robust against randomly changing weight-balanced digraphs for any positive coupling gain and requires no global information. Finally, numerical examples are presented to illustrate the proposed distributed algorithms.
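To make the setting concrete, the following is a minimal sketch of a generic continuous-time primal-dual consensus dynamic of the kind the abstract describes, simulated with forward Euler over a weight-balanced directed cycle. It is not the paper's exact IFP or derivative feedback algorithm; the quadratic local costs, the three-agent cycle, the coupling gain `alpha`, and the step size are all illustrative assumptions.

```python
import numpy as np

# Hypothetical quadratic local costs f_i(x) = (x - c_i)^2.
# The minimizer of sum_i f_i is the mean of the c_i, here 4.0.
c = np.array([1.0, 3.0, 8.0])

# Out-degree Laplacian of the directed cycle 1 -> 2 -> 3 -> 1.
# Each node has in-degree = out-degree = 1, so the digraph is weight-balanced.
L = np.array([[ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0],
              [-1.0,  0.0,  1.0]])

alpha = 1.0           # coupling gain (illustrative value)
dt, steps = 0.01, 20000

x = np.zeros(3)       # local decision estimates
v = np.zeros(3)       # integral feedback states, initialized to zero

for _ in range(steps):
    grad = 2.0 * (x - c)                  # local gradients of f_i
    x_dot = -grad - alpha * (L @ x) - v   # gradient flow + diffusive coupling
    v_dot = alpha * (L @ x)               # integral action drives consensus
    x += dt * x_dot
    v += dt * v_dot

print(x)  # all entries approach the global optimum 4.0
```

Because the digraph is weight-balanced and v starts at zero, the v states sum to zero for all time, so any equilibrium with consensus in x must satisfy the first-order optimality condition of the sum of the local costs.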

Citations (21)
