
Fixed-Time Stable Gradient Flows: Applications to Continuous-Time Optimization (1808.10474v5)

Published 30 Aug 2018 in math.OC

Abstract: This paper proposes novel gradient-flow schemes that yield convergence to the optimal point of a convex optimization problem within a *fixed* time from any given initial condition, covering unconstrained optimization, constrained optimization, and min-max problems. The application of the modified gradient flow to unconstrained optimization problems is studied under the assumption of gradient dominance. Then, a modified Newton's method is presented that exhibits fixed-time convergence under some mild conditions on the objective function. Building upon this method, a novel technique is developed for solving convex optimization problems with linear equality constraints that yields convergence to the optimal point in fixed time. More specifically, constrained optimization problems formulated as min-max problems are considered, and a novel method for computing the optimal solution in fixed time is proposed using the Lagrangian dual. Finally, the general min-max problem is considered, and a modified scheme is developed that drives the saddle-point dynamics to the optimal solution in fixed time. Numerical illustrations that compare the performance of the proposed method against Newton's method, the rescaled gradient method, and Nesterov's accelerated method are included to corroborate the efficacy and applicability of the modified gradient flows in constrained and unconstrained optimization problems.
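The abstract does not reproduce the flow equations themselves. As background, fixed-time stability results of this kind are typically established through a Polyakov-type Lyapunov inequality; the following is a minimal sketch of that standard condition, with the symbols $a$, $b$, $p$, $q$ used generically rather than in the paper's own notation.

```latex
% Polyakov-type fixed-time stability condition (generic form):
\dot{V}(x(t)) \le -a\,V(x(t))^{p} - b\,V(x(t))^{q},
\qquad a, b > 0,\quad 0 < p < 1,\quad q > 1,
% which bounds the settling time uniformly in the initial condition:
T \;\le\; \frac{1}{a\,(1-p)} + \frac{1}{b\,(q-1)}.
```

The key point is that the bound on $T$ does not depend on $V(x(0))$: the $V^{q}$ term dominates far from the optimum and the $V^{p}$ term dominates near it, so trajectories converge within the same fixed horizon from any starting point.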
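To make the idea concrete, here is a hypothetical numerical sketch in Python: a forward-Euler discretization of a rescaled gradient flow from this family, applied to a strongly convex quadratic. The function name `fixed_time_flow`, the gains `c1`, `c2`, and the exponents `alpha`, `beta` are illustrative assumptions, not the paper's specific scheme or notation.

```python
# Hypothetical sketch (not the paper's scheme): forward-Euler integration
# of a rescaled gradient flow  dx/dt = -c1*g*||g||^(alpha-1) - c2*g*||g||^(beta-1),
# whose continuous-time version is fixed-time stable for 0 < alpha < 1 < beta
# under gradient dominance.
import numpy as np

def fixed_time_flow(grad, x0, c1=1.0, c2=1.0, alpha=0.5, beta=2.0,
                    dt=1e-3, steps=20_000, tol=1e-10):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        n = np.linalg.norm(g)
        if n < tol:  # gradient numerically vanished: stop at the optimum
            break
        # Near the optimum (n small) the alpha term dominates; far away
        # (n large) the beta term dominates, mirroring the two Lyapunov terms.
        x = x - dt * (c1 * n**(alpha - 1) + c2 * n**(beta - 1)) * g
    return x

# Example: minimize f(x) = 0.5 * x^T A x with A positive definite.
A = np.diag([1.0, 10.0])
x_star = fixed_time_flow(lambda x: A @ x, x0=[5.0, -3.0])
print(x_star)  # approaches the minimizer [0, 0]
```

Note that the fixed-time guarantee is a property of the continuous-time flow; a plain Euler discretization like this one does not inherit it, which is why the loop also relies on an explicit gradient-norm tolerance.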
