A Structured Proximal Stochastic Variance Reduced Zeroth-order Algorithm (2506.23758v1)
Abstract: Minimizing finite sums of functions is a central problem in optimization, arising in numerous practical applications. Such problems are commonly addressed with first-order methods, but these cannot be used when gradient information is unavailable. Finite-difference methods provide an alternative by approximating gradients through function evaluations along a set of directions. For finite-sum minimization, prior work has shown that incorporating variance-reduction techniques into finite-difference methods improves convergence rates. Recent studies have further shown that imposing structure on the directions (e.g., orthogonality) enhances performance. However, the impact of structured directions on variance-reduced finite-difference methods remains unexplored. In this work, we close this gap by proposing a structured variance-reduced finite-difference algorithm for non-smooth finite-sum minimization. We analyze the proposed method, establishing convergence rates for non-convex functions and for functions satisfying the Polyak-Łojasiewicz condition. Our results show that the algorithm achieves state-of-the-art convergence rates while incurring lower per-iteration costs. Finally, numerical experiments highlight its strong practical performance.
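The abstract gives no pseudocode, so the following is a minimal sketch, not the paper's exact algorithm, of the ingredients it describes: an SVRG-style variance-reduced zeroth-order estimator built from finite differences along orthogonal directions, combined with a proximal step for the non-smooth term. The estimator scaling, step sizes, the choice of an l1 regularizer, and all function and parameter names are illustrative assumptions.

```python
# Sketch (under stated assumptions) of a proximal, variance-reduced,
# zeroth-order method with structured (orthonormal) directions for
# F(x) = (1/n) * sum_i f_i(x) + lam * ||x||_1.
import numpy as np

def orthogonal_directions(d, l, rng):
    """Return l orthonormal directions in R^d (QR of a Gaussian matrix)."""
    q, _ = np.linalg.qr(rng.standard_normal((d, l)))
    return q.T  # rows are orthonormal directions

def fd_gradient(f, x, directions, h):
    """Forward finite-difference gradient estimate along the given directions."""
    g = np.zeros_like(x)
    fx = f(x)
    for p in directions:
        g += (f(x + h * p) - fx) / h * p
    # d/l rescaling is one common convention for random-direction estimators
    return g * (x.size / len(directions))

def prox_l1(x, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def zo_svrg_prox(f_list, x0, epochs=50, inner=10, l=5, h=1e-5, lr=0.1,
                 lam=1e-3, seed=0):
    """Illustrative SVRG-style zeroth-order proximal loop (all names assumed)."""
    rng = np.random.default_rng(seed)
    n, x = len(f_list), x0.copy()
    f_full = lambda z: sum(fi(z) for fi in f_list) / n
    for _ in range(epochs):
        x_ref = x.copy()
        dirs_ref = orthogonal_directions(x.size, l, rng)
        g_ref = fd_gradient(f_full, x_ref, dirs_ref, h)  # snapshot FD gradient
        for _ in range(inner):
            i = rng.integers(n)
            dirs = orthogonal_directions(x.size, l, rng)
            # variance-reduced estimator: g_i(x) - g_i(x_ref) + g_ref
            v = (fd_gradient(f_list[i], x, dirs, h)
                 - fd_gradient(f_list[i], x_ref, dirs, h)
                 + g_ref)
            x = prox_l1(x - lr * v, lr * lam)
    return x

# Example: sparse least squares, f_i(x) = 0.5 * (a_i @ x - b_i)^2
rng = np.random.default_rng(1)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
fs = [lambda x, a=a, y=y: 0.5 * (a @ x - y) ** 2 for a, y in zip(A, b)]
x_hat = zo_svrg_prox(fs, np.zeros(10))
```

Drawing the l orthonormal directions via QR, rather than independent Gaussian directions, is one way to realize the "structured directions" the abstract refers to; the specific structure and estimator used in the paper may differ.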