Derandomize accelerated GD on separable convex functions
Determine whether there exists a deterministic stepsize schedule for Gradient Descent that achieves the fully accelerated iteration complexity $O(\sqrt{\kappa}\,\log(1/\varepsilon))$, where $\kappa = M/m$ is the condition number, on the class of separable, $m$-strongly convex, and $M$-smooth functions; in particular, ascertain whether some ordering of the Chebyshev stepsize sequence attains this rate.
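For intuition, below is a minimal numerical sketch in Python (not from the paper) of the Chebyshev stepsize schedule on a separable quadratic; the helper names `chebyshev_stepsizes` and `gd_with_schedule` and the test problem are illustrative assumptions. On a quadratic, every ordering yields the same final iterate, since the update factors $(I - \eta_k \nabla^2 f)$ commute; what differs is the transient behavior, and whether some ordering delivers the accelerated rate on general separable functions is exactly the open question.

```python
# Illustrative sketch (assumed setup, not from the paper): Chebyshev stepsizes
# for GD on a separable quadratic f(x) = 0.5 * sum_j lam_j * x_j^2, lam_j in [m, M].
import numpy as np

def chebyshev_stepsizes(m: float, M: float, n: int) -> np.ndarray:
    """Reciprocals of the n Chebyshev nodes of the interval [m, M]."""
    i = np.arange(1, n + 1)
    nodes = (M + m) / 2 + (M - m) / 2 * np.cos((2 * i - 1) * np.pi / (2 * n))
    return 1.0 / nodes

def gd_with_schedule(grad, x0, steps):
    """Vanilla GD, x_{k+1} = x_k - eta_k * grad(x_k); returns the final
    iterate and the peak intermediate norm (a proxy for transient blow-up)."""
    x = x0.copy()
    peak = np.linalg.norm(x)
    for eta in steps:
        x = x - eta * grad(x)
        peak = max(peak, np.linalg.norm(x))
    return x, peak

m, M, n = 1.0, 100.0, 32
lam = np.linspace(m, M, 50)          # diagonal Hessian spectrum in [m, M]
grad = lambda x: lam * x             # gradient of the separable quadratic
x0 = np.ones_like(lam)

base = chebyshev_stepsizes(m, M, n)
orderings = {
    "increasing": np.sort(base),
    "decreasing": np.sort(base)[::-1],
    "random":     np.random.default_rng(0).permutation(base),
}
for name, steps in orderings.items():
    xn, peak = gd_with_schedule(grad, x0, steps)
    print(f"{name:>10}: final ||x_n|| = {np.linalg.norm(xn):.2e}, "
          f"peak ||x_k|| = {peak:.2e}")
```

Running this prints (up to rounding) the same final error for all three orderings, while the peak intermediate norm varies by orders of magnitude; on non-quadratic separable functions this ordering-invariance breaks down, which is why the conjecture concerns a specific ordering.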
References
Is it possible to de-randomize Theorem~\ref{thm:sep:main}, i.e., construct a deterministic stepsize schedule which achieves the fully accelerated rate for separable functions? If so, a natural conjecture would be some ordering of the Chebyshev stepsize schedule.
                — Acceleration by Random Stepsizes: Hedging, Equalization, and the Arcsine Stepsize Schedule
                
                (arXiv:2412.05790, Altschuler et al., 8 Dec 2024), Section 7 (Conclusion and future work)