Adaptivity and memory for OGM

Ascertain whether the Optimized Gradient Method (OGM) can be endowed with an adaptive mechanism and with memory that leverages stored past oracle information.

Background

OGM achieves the best known worst-case rate among first-order methods for smooth convex minimization, but it is typically presented in a non-adaptive, memoryless form. In contrast, the authors' OGMM employs primal-dual estimate functions, integrates memory, and includes a procedure that adaptively adjusts its convergence guarantee and can serve as a replacement for line search.
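To make concrete what "non-adaptive, memoryless" means here, the following is a minimal Python sketch of the standard OGM recursion of Kim and Fessler: the Lipschitz constant L and the iteration budget N must be fixed in advance, the coefficients θ_i follow a predetermined schedule, and each step uses only the most recent oracle information. The objective and its gradient (`grad_f`) are hypothetical placeholders; this is not the adaptive, memory-based scheme discussed in the paper.

```python
import numpy as np

def ogm(grad_f, x0, L, N):
    """Standard (non-adaptive, memoryless) OGM of Kim & Fessler.

    grad_f : callable returning the gradient of a smooth convex f
    x0     : starting point (NumPy array)
    L      : Lipschitz constant of grad_f, assumed known in advance
    N      : total number of iterations, fixed in advance
    """
    x, y = x0.copy(), x0.copy()
    theta = 1.0
    for i in range(N):
        y_next = x - grad_f(x) / L                     # plain gradient step
        if i < N - 1:
            theta_next = (1.0 + np.sqrt(1.0 + 4.0 * theta ** 2)) / 2.0
        else:                                          # last step uses a larger coefficient
            theta_next = (1.0 + np.sqrt(1.0 + 8.0 * theta ** 2)) / 2.0
        # Momentum combines only the two most recent points; no stored history is used.
        x = (y_next
             + ((theta - 1.0) / theta_next) * (y_next - y)
             + (theta / theta_next) * (y_next - x))
        y, theta = y_next, theta_next
    return x

# Example (hypothetical problem): minimize f(x) = 0.5 * ||A x - b||^2,
# whose gradient is L-Lipschitz with L = ||A||_2^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad_f = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2
x_N = ogm(grad_f, np.zeros(2), L, N=100)
```

The fixed θ_i schedule and the need to know L and N up front are exactly the non-adaptive features that an adaptive, memory-based variant of OGM would aim to remove.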

Establishing whether OGM itself can incorporate such adaptive and memory features, while retaining its optimized worst-case performance, would expand the practical utility of OGM and bridge it with adaptive, memory-based schemes.

References

It remains an open question whether the estimate sequence framework can be used to derive the original OGM without augmentation, and whether OGM itself can be endowed with an adaptive mechanism and memory.

An optimal lower bound for smooth convex functions (arXiv:2404.18889, Florea et al., 29 Apr 2024), Section 8, Discussion.