An Equivalent Circuit Approach to Distributed Optimization (2305.14607v1)

Published 24 May 2023 in eess.SY, cs.SY, and math.OC

Abstract: Distributed optimization is an essential paradigm for solving large-scale optimization problems in modern applications where big data and high dimensionality create a computational bottleneck. Distributed optimization algorithms that exhibit fast convergence allow us to fully utilize computing resources and effectively scale to larger optimization problems in a myriad of areas ranging from machine learning to power systems. In this work, we introduce a new centralized distributed optimization algorithm (ECADO) inspired by an equivalent circuit model of the distributed problem. The equivalent circuit (EC) model provides a physical analogy from which we derive new insights and develop a fast-convergent algorithm. The main contributions of this approach are: 1) a weighting scheme based on a circuit-inspired aggregate sensitivity analysis, and 2) an adaptive step-sizing derived from a stable, Backward-Euler numerical integration. We demonstrate that ECADO exhibits faster convergence than state-of-the-art distributed optimization methods and provably converges for nonconvex problems. We leverage the ECADO features to solve convex and nonconvex optimization problems with large datasets, such as distributing data for logistic regression, training a deep neural network model for classification, and solving a high-dimensional security-constrained optimal power flow problem. Compared to state-of-the-art centralized methods, including ADMM, centralized gradient descent, and DANE, this new ECADO approach is shown to converge in fewer iterations.
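
The abstract's second contribution, an adaptive step size derived from a stable Backward-Euler numerical integration, can be illustrated with a generic sketch. The snippet below integrates the gradient-flow ODE dx/dt = -∇f(x) (the transient behind the equivalent-circuit analogy) using an implicit Backward-Euler step with a simple accept/reject step-size heuristic. This is an assumption-laden illustration of the general idea, not ECADO itself: the function names, the fixed-point inner solve, and the grow/halve step rule are placeholders, and the paper's weighting scheme and actual step-size law are not reproduced here.

```python
import numpy as np

def solve_implicit_step(grad, x, h, inner_tol=1e-10, inner_iters=100):
    """Fixed-point solve of the implicit equation x_new = x - h * grad(x_new).

    Returns (x_new, converged). This is a placeholder inner solver; any
    nonlinear solver (e.g. Newton) could be substituted.
    """
    x_new = x.copy()
    for _ in range(inner_iters):
        x_next = x - h * grad(x_new)
        if np.linalg.norm(x_next - x_new) < inner_tol:
            return x_next, True
        x_new = x_next
    return x_new, False

def backward_euler_gradient_flow(grad, x0, h=0.05, tol=1e-8, max_steps=500):
    """Drive dx/dt = -grad(x) to steady state with Backward Euler.

    The steady state satisfies grad(x) = 0, i.e. a stationary point of the
    objective. The step-size rule (grow on an accepted step, halve when the
    implicit solve fails) is a crude stand-in for an error-based rule.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_steps):
        x_new, ok = solve_implicit_step(grad, x, h)
        if not ok:
            h *= 0.5                      # reject the step; retry smaller
            continue
        if np.linalg.norm(x_new - x) < tol:
            return x_new                  # near steady state: done
        x, h = x_new, min(1.5 * h, 10.0)  # accept; cautiously grow the step
    return x

# Example: minimize f(x) = 0.5 * ||A x - b||^2 via its gradient A^T (A x - b).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
grad_f = lambda x: A.T @ (A @ x - b)
print(backward_euler_gradient_flow(grad_f, x0=np.zeros(2)))  # ~ [0.4, -0.2]
```

Because the update x_{k+1} = x_k - h ∇f(x_{k+1}) is implicit, it remains stable for large step sizes, which is the property the abstract attributes to the Backward-Euler-derived step sizing.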
