
Zeroth-Order Katyusha: An Accelerated Derivative-Free Method for Composite Convex Optimization

Published 12 Jul 2024 in math.OC (arXiv:2407.09190v1)

Abstract: We investigate accelerated zeroth-order algorithms for smooth composite convex optimization problems. For unconstrained optimization, existing methods that merge 2-point zeroth-order gradient estimators with first-order frameworks typically achieve satisfactory performance; for constrained/composite problems, however, there remains a gap in the complexity bound, which stems from the non-vanishing variance of the 2-point gradient estimator near an optimal point. To bridge this gap, we propose the Zeroth-Order Loopless Katyusha (ZO-L-Katyusha) algorithm, leveraging the variance reduction and acceleration techniques of the first-order loopless Katyusha algorithm. We show that ZO-L-Katyusha achieves accelerated linear convergence for composite smooth and strongly convex problems, and has the same oracle complexity as the unconstrained case. Moreover, the number of function queries needed to construct a zeroth-order gradient estimator in ZO-L-Katyusha can be made O(1) on average. These results suggest that ZO-L-Katyusha provides a promising approach towards bridging the gap in the complexity bound for zeroth-order composite optimization.
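The abstract's central object is the 2-point zeroth-order gradient estimator, whose non-vanishing variance near an optimum motivates the paper. Below is a minimal sketch of the standard 2-point estimator with Gaussian smoothing; the exact form used in the paper may differ (the function name, smoothing radius `mu`, and test function are illustrative assumptions, not taken from the paper).

```python
import numpy as np

def two_point_grad_estimate(f, x, mu=1e-4, rng=None):
    """Standard 2-point zeroth-order gradient estimator (sketch).

    Draws a random Gaussian direction u and approximates the gradient
    of f at x using two function queries. The estimate is unbiased for
    the gradient of a smoothed version of f, but its variance does not
    vanish as x approaches a minimizer -- the source of the complexity
    gap discussed in the abstract.
    """
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(x.shape)          # random search direction
    return (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

# Illustration on f(x) = ||x||^2 / 2, whose true gradient at x is x:
f = lambda x: 0.5 * np.dot(x, x)
x = np.array([1.0, -2.0, 3.0])
# Averaging many independent estimates recovers the gradient.
est = np.mean(
    [two_point_grad_estimate(f, x, rng=np.random.default_rng(s))
     for s in range(2000)],
    axis=0,
)
```

A single estimate is cheap (two queries) but noisy; the paper's contribution is combining such estimators with Katyusha-style variance reduction and acceleration so that, on average, O(1) queries per iteration suffice even in the composite setting.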

