Projected Subgradient Ascent for Convex Maximization (2511.00741v1)
Abstract: We consider the problem of maximizing a convex function over a closed convex set. Classical methods solve such problems using iterative schemes that repeatedly improve a solution. For linear maximization, we show that a single orthogonal projection suffices to obtain an approximate solution. For general convex functions over convex sets, we show that projected subgradient ascent converges to a first-order stationary point when using arbitrarily large step sizes. Taking the step size to infinity leads to the conditional gradient algorithm, and iterated linear optimization as a special case. We present numerical experiments using a single projection for linear optimization over the elliptope, reducing the problem to the computation of a nearest correlation matrix.
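The iteration the abstract describes, x_{k+1} = P_C(x_k + t g_k) with g_k a subgradient of f at x_k, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the quadratic objective, the box constraint set, and the step size below are assumptions chosen so that the projection is a simple clip.

```python
import numpy as np

def projected_subgradient_ascent(grad, project, x0, step=1e3, iters=100):
    """Sketch of x_{k+1} = P_C(x_k + step * grad(x_k)) with a large step size."""
    x = project(np.asarray(x0, dtype=float))
    for _ in range(iters):
        x = project(x + step * grad(x))
    return x

# Illustrative example (not from the paper): maximize the convex
# function f(x) = x^T Q x, Q positive definite, over the box [-1, 1]^2.
# Projection onto the box is coordinate-wise clipping.
Q = np.array([[2.0, 1.0],
              [1.0, 3.0]])
grad = lambda x: 2.0 * Q @ x
project = lambda x: np.clip(x, -1.0, 1.0)

x = projected_subgradient_ascent(grad, project, x0=[0.3, -0.2])
print(x)  # lands on a vertex of the box, a first-order stationary point
```

Consistent with the abstract's claim, the large step size drives the iterate to a first-order stationary point in very few iterations; here the iterate reaches a vertex of the box after a single projected step and stays there. The stationary point reached depends on the starting point and need not be the global maximizer.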