First-order constrained optimization algorithms with feasibility updates (1506.08247v1)
Abstract: We propose first-order algorithms for convex optimization problems where the feasible set is described by a large number of convex inequalities, which are explored via subgradient projections. The first algorithm is an adaptation of a subgradient algorithm and has convergence rate $1/\sqrt{k}$. The second algorithm has convergence rate $1/k$ when (1) a linear metric inequality holds for the feasible set, (2) the objective function is strongly convex and differentiable with a Lipschitz gradient, and (3) it is easy to optimize the objective function on the intersection of two halfspaces. This second algorithm generalizes Haugazeau's algorithm. The third algorithm adapts the second when condition (3) is dropped. We give examples to show that the second algorithm performs poorly when the objective function is not strongly convex or when the linear metric inequality is absent.
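To make the general pattern concrete, here is a minimal sketch of a projected-subgradient scheme with feasibility updates via Polyak subgradient projections. This is an illustration of the generic technique, not the paper's exact algorithms: the $1/\sqrt{k}$ step size, the most-violated-constraint rule, and all function names and the example problem are assumptions made for the sketch.

```python
import numpy as np

def subgradient_projection(x, g, g_sub):
    """Polyak subgradient projection: project x onto the halfspace
    {y : g(x) + <s, y - x> <= 0}, which contains {y : g(y) <= 0} for convex g."""
    val = g(x)
    if val <= 0:
        return x  # x already satisfies this inequality
    s = g_sub(x)
    return x - (val / np.dot(s, s)) * s

def projected_subgradient(f_sub, constraints, x0, steps=1000):
    """Alternate a diminishing subgradient step on the objective with a
    feasibility update: a subgradient projection onto the most violated
    inequality (illustrative rule; the paper's selection may differ)."""
    x = x0.astype(float)
    for k in range(1, steps + 1):
        # objective step with a 1/sqrt(k) step size (matches the stated rate regime)
        x = x - (1.0 / np.sqrt(k)) * f_sub(x)
        # feasibility update on the most violated constraint g_i(x) <= 0
        g, g_sub = max(constraints, key=lambda con: con[0](x))
        x = subgradient_projection(x, g, g_sub)
    return x

# Hypothetical example: minimize ||x - c||_1 over {x : ||x||_2^2 <= 1, x_1 + x_2 <= 1}
c = np.array([2.0, 0.5])
f_sub = lambda x: np.sign(x - c)  # a subgradient of the l1 objective
cons = [
    (lambda x: np.dot(x, x) - 1.0, lambda x: 2.0 * x),
    (lambda x: x[0] + x[1] - 1.0, lambda x: np.array([1.0, 1.0])),
]
x_star = projected_subgradient(f_sub, cons, np.zeros(2))
```

Each feasibility update touches only one of the many inequalities, which is the point of exploring the feasible set by subgradient projections rather than projecting onto the full intersection at every step.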