
Gradient-push algorithm for distributed optimization with event-triggered communications (2111.06315v1)

Published 11 Nov 2021 in math.OC

Abstract: Decentralized optimization problems involve multiple agents connected by a network. Each agent holds a local cost function, and the goal is to cooperatively minimize the sum of these functions. This requires the agents to communicate with one another, and reducing the communication cost is desirable in communication-limited environments. In this work, we propose a gradient-push algorithm with event-triggered communication over a directed network. Each agent sends its state information to its neighbors only when the difference between the latest sent state and its current state exceeds a threshold. Convergence of the algorithm is established under decay and summability conditions on the stepsize and the triggering threshold. Numerical experiments are presented to support the effectiveness and the convergence results of the algorithm.
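The mechanism described in the abstract can be illustrated with a minimal sketch: a push-sum (gradient-push) iteration over a column-stochastic mixing matrix, where each agent rebroadcasts its state only when it has drifted from the last sent value by more than a threshold. All names, the stepsize schedule, and the threshold schedule below are illustrative assumptions, not the paper's exact algorithm or constants; the event-triggered weight updates in the paper may also differ from this simplification.

```python
import numpy as np

def gradient_push_event_triggered(grads, A, x0, steps=2000):
    """Hypothetical sketch of event-triggered gradient-push.

    grads : list of per-agent gradient callables grad_i(z)
    A     : column-stochastic mixing matrix for the directed network;
            A[i, j] > 0 iff agent j sends to agent i
    x0    : (n,) initial scalar states, one per agent
    """
    n = len(x0)
    x = np.array(x0, dtype=float)   # agent states
    y = np.ones(n)                  # push-sum weights (bias correction)
    x_hat = x.copy()                # last broadcast states
    for k in range(steps):
        alpha = 1.0 / (k + 10)      # decaying stepsize (assumed schedule)
        tau = 1.0 / (k + 10) ** 2   # summable triggering threshold (assumed)
        # Event-triggered broadcast: resend only on sufficient deviation.
        trig = np.abs(x - x_hat) > tau
        x_hat[trig] = x[trig]
        # Push-sum mixing: neighbors see broadcast values x_hat,
        # while each agent mixes in its own true state.
        x = A @ x_hat + np.diag(A) * (x - x_hat)
        y = A @ y
        z = x / y                   # de-biased local estimates
        # Local gradient step on the de-biased estimate.
        x = x - alpha * np.array([g(zi) for g, zi in zip(grads, z)])
    return x / y
```

For example, three agents with quadratics f_i(z) = (z - c_i)^2 on a directed ring should drive every de-biased estimate toward the minimizer of the sum, i.e. the mean of the c_i.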
