Optimality of the Silver Stepsize Schedule among deterministic schedules
Prove that the silver convergence rate $O(\kappa^{\log_{1+\sqrt{2}} 2} \log(1/\varepsilon))$, where $\kappa = M/m$ is the condition number, achieved by the Silver Stepsize Schedule is optimal among all deterministic stepsize schedules for gradient descent on the class of $m$-strongly convex and $M$-smooth functions, thereby ruling out any deterministic schedule with a strictly better asymptotic dependence on $\kappa$.
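To make the claimed separation concrete, a small arithmetic sketch (not from the source) of where the silver exponent $\log_{1+\sqrt{2}} 2 \approx 0.786$ sits between the exponent $1$ of constant-stepsize gradient descent and the exponent $1/2$ of the accelerated lower bound:

```python
import math

# Silver ratio rho = 1 + sqrt(2); the silver rate's condition-number
# exponent is log_rho(2) = log 2 / log(1 + sqrt(2)).
rho = 1 + math.sqrt(2)
silver_exp = math.log(2) / math.log(rho)

# For m-strongly convex, M-smooth functions with kappa = M/m:
#   constant-stepsize GD:  O(kappa^1      * log(1/eps)) iterations
#   silver schedule:       O(kappa^0.7864 * log(1/eps)) iterations
#   accelerated lower bd:  O(kappa^0.5    * log(1/eps)) iterations
print(f"silver exponent log_(1+sqrt2)(2) = {silver_exp:.4f}")
```

The optimality question asks whether any deterministic schedule can push this exponent below $\log_{1+\sqrt{2}} 2$, strictly between the silver rate and the $\sqrt{\kappa}$ barrier.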
References
This rate is conjecturally optimal among all possible deterministic stepsize schedules~\citep{alt23hedging1} and naturally extends to non-strongly convex optimization~\citep{alt23hedging2}.
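For reference, the Silver Stepsize Schedule in the smooth convex (non-strongly-convex) setting cited above admits a closed form via the 2-adic valuation; the sketch below is our paraphrase of that construction (the formula `1 + rho**(v(t) - 1)`, in units of $1/M$, is an assumption based on the recursive definition $\pi_{k+1} = [\pi_k,\, 1+\rho^{k-1},\, \pi_k]$, $\pi_1 = [\sqrt{2}]$), not code from the source:

```python
import math

RHO = 1 + math.sqrt(2)  # the silver ratio

def two_adic_valuation(t: int) -> int:
    """Largest power of 2 dividing t (t >= 1)."""
    v = 0
    while t % 2 == 0:
        t //= 2
        v += 1
    return v

def silver_schedule(n: int) -> list[float]:
    """Silver stepsizes h_1..h_n (in units of 1/M), intended for n = 2^k - 1."""
    return [1 + RHO ** (two_adic_valuation(t) - 1) for t in range(1, n + 1)]

# First few stepsizes: sqrt(2), 2, sqrt(2), 2+sqrt(2), sqrt(2), 2, sqrt(2), ...
print(silver_schedule(7))
```

The fractal, palindromic pattern (each length-$(2^{k+1}{-}1)$ schedule is two copies of the previous one around a single larger stepsize) is what produces the $\kappa^{\log_{1+\sqrt{2}} 2}$ rate in the strongly convex case.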
— "Acceleration by Random Stepsizes: Hedging, Equalization, and the Arcsine Stepsize Schedule" (Altschuler et al., arXiv:2412.05790, 8 Dec 2024), Section 1.2 (Contribution and discussion)