Convexity of the gradient-descent optimization curve for step sizes between 1/L and 1.75/L

Determine whether, for gradient descent on convex L-smooth functions with step sizes η ∈ (1/L, 1.75/L], the optimization curve (the linear interpolation of the sequence {(n, f(x_n))} induced by x_n = x_{n-1} - η∇f(x_{n-1})) is necessarily convex for all functions and initializations, or whether there exist convex L-smooth functions and initializations in this step-size range that yield a non-convex optimization curve.
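
Because the interpolation points are equally spaced in n, the optimization curve is convex exactly when every second difference f(x_{n-1}) - 2f(x_n) + f(x_{n+1}) is non-negative, i.e., when the per-step decreases shrink monotonically. The following minimal Python sketch checks this criterion numerically; the helper names gd_values and curve_is_convex and the 1-D Huber test function (convex and 1-smooth) are illustrative choices, not from the paper.

```python
import numpy as np

def gd_values(f, grad, x0, eta, steps):
    """Run gradient descent x_n = x_{n-1} - eta * grad(x_{n-1})
    and record f(x_n) for n = 0, ..., steps."""
    x, vals = x0, [f(x0)]
    for _ in range(steps):
        x = x - eta * grad(x)
        vals.append(f(x))
    return np.array(vals)

def curve_is_convex(vals, tol=1e-12):
    """The linear interpolation of {(n, f(x_n))} is convex iff every
    second difference f(x_{n-1}) - 2*f(x_n) + f(x_{n+1}) is >= 0."""
    return bool(np.all(np.diff(vals, 2) >= -tol))

# Illustrative test function: the 1-D Huber function, convex and L-smooth with L = 1.
huber = lambda x: 0.5 * x**2 if abs(x) <= 1 else abs(x) - 0.5
huber_grad = lambda x: float(np.clip(x, -1.0, 1.0))

# eta = 0.9 <= 1/L, so Theorem 1 guarantees a convex curve here.
print(curve_is_convex(gd_values(huber, huber_grad, x0=3.0, eta=0.9, steps=50)))
```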

Background

The paper studies when the optimization curve induced by gradient descent on convex L-smooth functions is convex. Theorem 1 proves convexity for step sizes η ∈ (0, 1/L], while Theorem 2 shows that for η ∈ (1.75/L, 2/L), even though gradient descent converges monotonically, the optimization curve can be non-convex.

This leaves a gap for η ∈ (1/L, 1.75/L], where it is unknown whether convexity always holds or counterexamples exist. Resolving this would complete the characterization of optimization-curve convexity across the full convergence regime η ∈ (0, 2/L).
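
One way to explore this regime empirically (purely illustrative, and no substitute for a proof) is to scan step sizes across (0, 2/L) and apply the second-difference criterion above. The sketch below reuses gd_values, curve_is_convex, huber, and huber_grad from the previous snippet; finding no non-convex curve on a simple test function is inconclusive, since the Theorem 2 counterexamples rely on carefully constructed functions.

```python
# Scan step sizes over the convergence regime (0, 2/L) for one candidate
# function and initialization; reuses the helpers defined above.
L = 1.0
found = False
for eta in np.linspace(0.05, 1.95, 39) / L:
    vals = gd_values(huber, huber_grad, x0=3.0, eta=eta, steps=200)
    if not curve_is_convex(vals):
        print(f"non-convex optimization curve at eta = {eta:.3f}")
        found = True
if not found:
    print("no non-convex curve found in this scan (inconclusive)")
```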

References

This leaves open the behavior in the regime η ∈ (1/L, 1.75/L].

Are Convex Optimization Curves Convex? (arXiv:2503.10138, Barzilai et al., 13 Mar 2025), in Conclusion and Discussion, end of section