First order constrained optimization algorithms with feasibility updates (1506.08247v1)

Published 27 Jun 2015 in math.OC

Abstract: We propose first-order algorithms for convex optimization problems where the feasible set is described by a large number of convex inequalities that are handled via subgradient projections. The first algorithm is an adaptation of a subgradient algorithm and has convergence rate $1/\sqrt{k}$. The second algorithm has convergence rate $1/k$ when (1) a linear metric inequality holds for the feasible set, (2) the objective function is strongly convex and differentiable with a Lipschitz gradient, and (3) it is easy to optimize the objective function on the intersection of two halfspaces. This second algorithm generalizes Haugazeau's algorithm. The third algorithm adapts the second algorithm when condition (3) is dropped. We give examples to show that the second algorithm performs poorly when the objective function is not strongly convex, or when the linear metric inequality is absent.
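The abstract describes the algorithms only at a high level. As a concrete illustration, below is a minimal Python sketch of a generic subgradient method with feasibility updates in the spirit of the first algorithm: subgradient steps on the objective are interleaved with subgradient (halfspace) projections onto violated constraints. The function names, constraint interface, step-size rule, and toy example are illustrative assumptions, not details taken from the paper.

```python
import numpy as np


def subgradient_with_feasibility_updates(f, f_subgrad, constraints, x0,
                                         num_iters=1000, step0=1.0):
    """Generic sketch (not the paper's exact algorithm).

    Alternates subgradient steps on the objective with subgradient
    projections onto violated constraints.

    f           -- objective, f(x) -> float
    f_subgrad   -- subgradient oracle, f_subgrad(x) -> ndarray
    constraints -- list of (g_i, g_i_subgrad) pairs; feasible iff g_i(x) <= 0
    x0          -- starting point
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_val = x.copy(), np.inf

    for k in range(1, num_iters + 1):
        # Evaluate all constraints and pick the most violated one.
        vals = [(g(x), g_sub) for g, g_sub in constraints]
        g_max, g_max_subgrad = max(vals, key=lambda t: t[0])

        if g_max > 0.0:
            # Feasibility update: project x onto the halfspace
            # {y : g_i(x) + <s, y - x> <= 0}, which contains the feasible set.
            s = g_max_subgrad(x)
            x = x - (g_max / (np.dot(s, s) + 1e-12)) * s
        else:
            # x is feasible: record the best value seen so far, then take a
            # diminishing-step subgradient step on the objective.
            fx = f(x)
            if fx < best_val:
                best_x, best_val = x.copy(), fx
            s = f_subgrad(x)
            x = x - (step0 / np.sqrt(k)) * s

    return best_x, best_val


# Toy usage: minimize ||x - c||_1 over the unit Euclidean ball.
c = np.array([2.0, -1.0])
obj = lambda x: np.sum(np.abs(x - c))
obj_sub = lambda x: np.sign(x - c)
ball = (lambda x: np.dot(x, x) - 1.0, lambda x: 2.0 * x)

x_best, val_best = subgradient_with_feasibility_updates(
    obj, obj_sub, [ball], x0=np.zeros(2), num_iters=5000)
print(x_best, val_best)
```

The diminishing $1/\sqrt{k}$ step size is the standard choice for subgradient methods with an $O(1/\sqrt{k})$ rate; per the abstract, the paper's faster $1/k$ algorithms additionally require strong convexity of the objective and a linear metric inequality on the feasible set.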
