Distributed Optimisation with Linear Equality and Inequality Constraints using PDMM (2309.12897v2)

Published 22 Sep 2023 in cs.DC

Abstract: In this paper, we consider the problem of distributed optimisation of a separable convex cost function over a graph, where every edge and node in the graph may carry linear equality and/or inequality constraints. We show how to modify the primal-dual method of multipliers (PDMM), originally designed for linear equality constraints, so that it can handle inequality constraints as well. The proposed algorithm does not need any slack variables, similar to the recent work [1], which extends the alternating direction method of multipliers (ADMM) to decomposable optimisation with linear equality and inequality constraints. Using convex analysis, monotone operator theory and fixed-point theory, we show how to derive the update equations of the modified PDMM algorithm by applying Peaceman-Rachford splitting to the monotone inclusion related to the lifted dual problem. To incorporate the inequality constraints, we impose a non-negativity constraint on the associated dual variables. This additional constraint results in the introduction of a reflection operator to model the data exchange in the network, instead of the permutation operator derived for equality-constrained PDMM. Convergence results for both synchronous and stochastic update schemes of PDMM are provided; the latter include asynchronous update schemes and update schemes with transmission losses. Experiments show that PDMM converges notably faster than the extended ADMM of [1].
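The key modification described in the abstract is that the dual variables attached to inequality constraints are restricted to the non-negative orthant. The paper's actual update equations arise from Peaceman-Rachford splitting of the lifted dual monotone inclusion and are not reproduced here; the sketch below only illustrates the underlying idea on a single inequality constraint, using plain projected dual ascent on a toy problem. All names, numbers and step sizes are hypothetical, chosen for this illustration only.

```python
# Minimal sketch (not the paper's PDMM updates): projected dual ascent on a toy
# problem, illustrating why the multiplier of an inequality constraint is kept
# non-negative, whereas an equality-constraint multiplier would be unconstrained.
import numpy as np

a = np.array([2.0, 1.0])   # minimise 0.5*||x - a||^2
c = np.array([1.0, 1.0])   # subject to c^T x <= d
d = 1.0

lam = 0.0                  # dual variable for the inequality constraint
alpha = 0.4                # step size (< 2/||c||^2 suffices for this toy problem)

for _ in range(200):
    # Primal step: the Lagrangian minimiser for fixed lam has a closed form.
    x = a - lam * c
    # Dual step: gradient ascent on the dual, then projection onto lam >= 0.
    # For an equality constraint this projection would be absent.
    lam = max(0.0, lam + alpha * (c @ x - d))

print("x* ~", x, " lam* ~", lam)   # expected: x* ~ [1, 0], lam* ~ 1
```

In the paper's distributed setting, the analogous step is an element-wise non-negativity condition on the edge dual variables, which turns the permutation (exchange) operator of equality-constrained PDMM into a reflection operator; the sketch above does not attempt that distributed, Peaceman-Rachford form.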

Citations (4)
