
Performance Improvement in Noisy Linear Consensus Networks with Time-Delay (1810.08287v1)

Published 18 Oct 2018 in cs.SY

Abstract: We analyze the performance of a class of time-delay first-order consensus networks from a graph-topological perspective and present methods to improve it. Performance is measured by the square of the network's H-2 norm, which is shown to be a convex function of the Laplacian eigenvalues and the coupling weights of the network's underlying graph. First, we propose a tight yet simple convex approximation of the performance measure that reduces the complexity of our design problems by eliminating the need for eigen-decomposition. The effect of time-delay manifests itself as non-monotonicity, which results in nonintuitive behavior of the performance as a function of graph topology. Next, we present three methods to improve the performance by growing, re-weighting, or sparsifying the underlying graph of the network. We show that the suggested algorithms provide near-optimal solutions at lower complexity than existing methods in the literature.
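
The performance measure described in the abstract is a spectral function of the graph Laplacian, so it can be evaluated directly from the Laplacian eigenvalues. The sketch below illustrates this under an assumed functional form, (1 + sin(λτ)) / (2λ cos(λτ)) summed over the nonzero eigenvalues, which is borrowed from the known stationary variance of a scalar delayed diffusion and reduces to the standard delay-free value 1/(2λ) at τ = 0; the exact measure used in the paper may differ. The names `laplacian` and `h2_squared` are illustrative, not from the paper.

```python
# Minimal sketch: evaluating a spectral H-2-type performance measure for a
# first-order consensus network dx/dt = -L x(t - tau) + noise.
# Assumed form: rho(L, tau) = sum_{i>=2} (1 + sin(l_i*tau)) / (2*l_i*cos(l_i*tau)),
# where l_2 <= ... <= l_n are the nonzero Laplacian eigenvalues.
import numpy as np


def laplacian(adjacency: np.ndarray) -> np.ndarray:
    """Weighted graph Laplacian L = D - A."""
    return np.diag(adjacency.sum(axis=1)) - adjacency


def h2_squared(adjacency: np.ndarray, tau: float) -> float:
    """Assumed squared H-2 performance of the delayed consensus network."""
    eigs = np.linalg.eigvalsh(laplacian(adjacency))  # ascending order
    lam = eigs[1:]                                   # drop the zero (consensus) mode
    if np.any(lam * tau >= np.pi / 2):               # delayed dynamics unstable beyond this
        return np.inf
    return float(np.sum((1.0 + np.sin(lam * tau)) / (2.0 * lam * np.cos(lam * tau))))


# Example: unweighted 4-node cycle, with and without delay.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
print(h2_squared(A, tau=0.0), h2_squared(A, tau=0.3))
```

Evaluating such a measure on candidate graphs is one way the growing, re-weighting, and sparsification steps mentioned in the abstract could be compared numerically; the convex approximation proposed in the paper avoids the eigen-decomposition used here.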

Citations (6)
