
Relaxing noise assumptions in tuning‑free upper bounds to expected smoothness

Establish tuning‑free optimization guarantees for the algorithms analyzed in this paper (including DoG and DoWG in the convex smooth/Lipschitz settings and the restarted SGD variant in the nonconvex smooth setting) under noise models characterized by expected smoothness of the stochastic gradients, rather than assuming bounded or sub‑Gaussian gradient noise norms.
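As a reminder of the step‑size rules in question, here is a sketch paraphrasing the standard DoG and DoWG formulations (our notation, not taken from this excerpt). Both methods update $x_{t+1} = x_t - \eta_t g_t$ with a learning rate built only from observed quantities:

\[
\text{DoG:}\quad \bar r_t = \max\Big(r_\epsilon,\ \max_{i \le t} \|x_i - x_0\|\Big), \qquad \eta_t = \frac{\bar r_t}{\sqrt{\sum_{i=0}^{t} \|g_i\|^2}},
\]
\[
\text{DoWG:}\quad \bar r_t = \max\big(\bar r_{t-1},\ \|x_t - x_0\|\big), \qquad v_t = v_{t-1} + \bar r_t^2 \|g_t\|^2, \qquad \eta_t = \frac{\bar r_t^2}{\sqrt{v_t}},
\]

where $g_t$ is the stochastic gradient at $x_t$ and $r_\epsilon > 0$ is a small initial movement parameter. The open problem asks whether the guarantees proved for such parameter‑free rules survive when the gradient noise is controlled only through expected smoothness.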


Background

The paper’s positive results rely on relatively strong noise assumptions: almost surely bounded gradient noise or gradient noise with sub‑Gaussian norm. In the conclusion, the authors explicitly raise the question of extending these guarantees to weaker, widely used stochastic noise models such as expected smoothness (e.g., as formalized by Gower et al., 2019).
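For concreteness, a sketch of the expected smoothness condition in the sense of Gower et al. (2019), paraphrased here rather than quoted from the paper: for an objective $f$ with minimizer $x_*$ and stochastic gradients $\nabla f_\xi$, there exists $\mathcal{L} > 0$ such that

\[
\mathbb{E}_\xi\big[\|\nabla f_\xi(x) - \nabla f_\xi(x_*)\|^2\big] \;\le\; 2\mathcal{L}\,\big(f(x) - f(x_*)\big) \qquad \text{for all } x,
\]

with the residual noise measured only at the optimum via $\sigma_*^2 := \mathbb{E}_\xi\big[\|\nabla f_\xi(x_*)\|^2\big]$. The nonconvex counterpart cited in the quoted passage below (Khaled and Richtárik, 2020) is the ABC condition $\mathbb{E}_\xi\|\nabla f_\xi(x)\|^2 \le 2A\,\big(f(x) - f^{\inf}\big) + B\,\|\nabla f(x)\|^2 + C$. Neither condition imposes a uniform or sub‑Gaussian bound on the noise norm, which is what makes the requested extension nontrivial.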

Resolving this would broaden the applicability of tuning‑free optimization beyond the current restrictive noise assumptions and bring the theory closer to the stochastic settings encountered in practice.

References

"The upper bounds we develop in both the convex and nonconvex settings require quite stringent assumptions on the noise (such as boundedness or sub-gaussian norm), and it is not known if they can be relaxed to expected smoothness~\citep{gower19_sgd,khaled20_better_theor_sgd_noncon_world} or some variant of it. We leave these questions to future work."

Tuning-Free Stochastic Optimization (arXiv:2402.07793, Khaled et al., 12 Feb 2024), Section 7: Conclusion and Open Problems