
Generalized gradient optimization over lossy networks for partition-based estimation (1710.10829v1)

Published 30 Oct 2017 in math.OC, cs.SY, and eess.SY

Abstract: We address the problem of distributed convex unconstrained optimization over networks characterized by asynchronous and possibly lossy communications. We analyze the case where the global cost function is the sum of locally coupled, strictly convex local cost functions. As discussed in detail in a motivating example, this class of optimization objectives arises, for example, in localization problems and in partition-based state estimation. Inspired by a generalized gradient descent strategy, namely the block Jacobi iteration, we propose a novel solution which is amenable to a distributed implementation and which, under a suitable condition on the step size, is provably locally resilient to communication failures. The theoretical analysis relies on the separation of time scales and Lyapunov theory. In addition, to show the flexibility of the proposed algorithm, we derive a resilient gradient descent iteration and a resilient generalized gradient iteration for quadratic programming as two natural particularizations of our strategy. In the latter case, global robustness is provided. Finally, the proposed algorithm is numerically tested on the IEEE 123-node distribution feeder in the context of partition-based robust smart-grid state estimation in the presence of measurement outliers.
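
To make the abstract's idea concrete, the following is a minimal sketch (not the paper's exact algorithm) of a block Jacobi, scaled-gradient iteration on a partitioned quadratic cost, where each node reuses the last block it received from a neighbor when a packet is lost. All names and parameters (`N_BLOCKS`, `BLK`, `LOSS_PROB`, `STEP`) and the i.i.d. Bernoulli loss model are illustrative assumptions, not taken from the paper.

```python
# Sketch: block Jacobi iteration on f(x) = 0.5 x^T A x - b^T x over a
# lossy network. Each node owns one block of x; neighbors' blocks
# arrive over links that drop packets independently with prob LOSS_PROB.
import numpy as np

rng = np.random.default_rng(0)
N_BLOCKS, BLK = 4, 3            # 4 nodes, each owning a 3-dim block
n = N_BLOCKS * BLK
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)     # symmetric positive definite cost matrix
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)  # reference minimizer of the quadratic

LOSS_PROB = 0.3                 # packet-loss probability (assumed)
STEP = 0.5                      # step size; resilience requires it small enough

def blk(i):
    return slice(i * BLK, (i + 1) * BLK)

x = np.zeros(n)
# each node keeps the last value it received from every other node
last_seen = [x.copy() for _ in range(N_BLOCKS)]

for k in range(200):
    x_new = x.copy()
    for i in range(N_BLOCKS):
        # neighbor packets arrive independently; on loss, keep stale value
        for j in range(N_BLOCKS):
            if j != i and rng.random() > LOSS_PROB:
                last_seen[i][blk(j)] = x[blk(j)]
        last_seen[i][blk(i)] = x[blk(i)]   # own block is always current
        # local gradient of the quadratic, evaluated on possibly stale data
        grad_i = A[blk(i), :] @ last_seen[i] - b[blk(i)]
        # scaled ("generalized") gradient step, preconditioned by A_ii^{-1}
        Aii = A[blk(i), blk(i)]
        x_new[blk(i)] = x[blk(i)] - STEP * np.linalg.solve(Aii, grad_i)
    x = x_new

print("distance to minimizer:", np.linalg.norm(x - x_star))
```

With a sufficiently small `STEP`, the iterates approach the minimizer despite the stale neighbor information, which mirrors the step-size condition for resilience stated in the abstract.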

Citations (3)
