
Improving the Robustness of the Projected Gradient Descent Method for Nonlinear Constrained Optimization Problems in Topology Optimization (2412.07634v1)

Published 10 Dec 2024 in math.OC, math-ph, and math.MP

Abstract: The Projected Gradient Descent (PGD) algorithm is a widely used and efficient first-order method for solving constrained optimization problems due to its simplicity and scalability in large design spaces. Building on recent advancements in the PGD algorithm, in which an inertial step component was introduced to improve efficiency in solving constrained optimization problems, this study introduces two key enhancements to further improve the algorithm's performance and adaptability in large-scale design spaces. First, univariate constraints (such as design variable bound constraints) are incorporated directly into the projection step via the Schur complement and an improved active set algorithm with bulk constraint manipulation, avoiding the issues associated with min-max clipping. Second, the update step is decomposed relative to the constraint vector space, enabling a post-projection adjustment based on the state of the constraints and an approximation of the Lagrangian, significantly improving the algorithm's robustness for problems with nonlinear constraints. Applied to a topology optimization problem for heat sink design, the proposed PGD algorithm demonstrates performance comparable to or exceeding that of the Method of Moving Asymptotes (MMA), with minimal parameter tuning. These results position the enhanced PGD as a robust tool for complex optimization problems with large variable spaces, such as topology optimization.
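
To make the general structure concrete, the sketch below shows a minimal inertial PGD loop in Python for the common topology-optimization case of a single volume constraint plus box bounds on the design variables. This is an illustrative simplification under stated assumptions, not the paper's method: the projection here is computed by bisection on the scalar Lagrange multiplier (exact for this simple constraint set), whereas the paper handles bound constraints inside a general multi-constraint projection via the Schur complement and a bulk active-set algorithm. All function names (project_volume_box, pgd_inertial) and parameter values are hypothetical.

```python
import numpy as np

def project_volume_box(y, vol_frac, lo=0.0, hi=1.0, tol=1e-10):
    """Euclidean projection of y onto {x : mean(x) = vol_frac, lo <= x <= hi}.

    Bisects on the scalar Lagrange multiplier of the volume constraint;
    clipping inside the bisection enforces the box bounds exactly for
    this one-constraint case. (Hypothetical helper, not the paper's
    Schur-complement active-set projection.)
    """
    n = y.size
    target = vol_frac * n
    # Bracket the multiplier: at lam_lo all entries clip to lo,
    # at lam_hi all entries clip to hi.
    lam_lo, lam_hi = lo - y.max(), hi - y.min()
    for _ in range(200):
        lam = 0.5 * (lam_lo + lam_hi)
        x = np.clip(y + lam, lo, hi)
        if x.sum() > target:
            lam_hi = lam  # multiplier too large, shrink from above
        else:
            lam_lo = lam
        if lam_hi - lam_lo < tol:
            break
    return np.clip(y + 0.5 * (lam_lo + lam_hi), lo, hi)

def pgd_inertial(grad, x0, vol_frac, step=0.1, beta=0.5, iters=100):
    """Projected gradient descent with a heavy-ball inertial term.

    grad : callable returning the objective gradient at x.
    beta : inertial (momentum) weight on the previous step.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        # Gradient step plus inertial component, then project back
        # onto the feasible set.
        y = x - step * grad(x) + beta * (x - x_prev)
        x_prev, x = x, project_volume_box(y, vol_frac)
    return x

# Usage example on a toy quadratic objective with a 40% volume fraction.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c = rng.uniform(size=500)
    x_opt = pgd_inertial(lambda x: 2.0 * (x - c), np.full(500, 0.4), 0.4)
    print(x_opt.mean())  # ~0.4, the volume constraint is satisfied
```

The bisection-based projection is a standard alternative to naive min-max clipping: clipping alone can leave the volume constraint violated, while solving for the multiplier keeps the iterate on the constraint surface at every step.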
