
Improved Iteration Complexity in Black-Box Optimization Problems under Higher Order Smoothness Function Condition (2407.03507v1)

Published 3 Jul 2024 in math.OC

Abstract: This paper studies the black-box optimization problem, common in many applications, in which the black box is a gradient-free oracle $\tilde{f}(x) = f(x) + \xi$ that returns the objective function value corrupted by stochastic noise $\xi$. Assuming that the objective function is $\mu$-strongly convex and not merely $L$-smooth but has a higher order of smoothness ($\beta \geq 2$), we propose a novel optimization method, Zero-Order Accelerated Batched Stochastic Gradient Descent, whose theoretical analysis settles the question of iteration complexity by achieving optimal estimates. Moreover, we provide a thorough analysis of the maximum noise level and show under which conditions the maximum noise level accounts for both the batch size $B$ and the smoothness order $\beta$ of the function.
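
To make the oracle model concrete, below is a minimal Python sketch of a gradient-free setup in the spirit of the abstract: a noisy value-only oracle and a batched randomized gradient estimate averaged over $B$ queries, plugged into a plain descent loop. This is not the paper's Zero-Order Accelerated Batched SGD; the kernel-weighted estimator exploiting smoothness order $\beta \geq 2$, the acceleration schedule, and all step-size choices are omitted, and every parameter value here is an illustrative assumption.

```python
import numpy as np

def noisy_value_oracle(f, x, noise_level=1e-3, rng=None):
    """Gradient-free oracle: returns f(x) plus bounded stochastic noise xi."""
    rng = np.random.default_rng() if rng is None else rng
    return f(x) + noise_level * rng.uniform(-1.0, 1.0)

def batched_zo_gradient(f, x, h=1e-2, batch_size=16, rng=None):
    """Batched randomized two-point gradient estimate.

    Averages B independent directional estimates
        (d / (2h)) * (f(x + h e) - f(x - h e)) * e,  e uniform on the sphere,
    which reduces the estimator's variance by roughly a factor of B.
    (The paper's estimator additionally exploits smoothness order beta >= 2
    via kernel weights; this two-point version is a simplified stand-in.)
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(batch_size):
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)
        fp = noisy_value_oracle(f, x + h * e, rng=rng)
        fm = noisy_value_oracle(f, x - h * e, rng=rng)
        g += (d / (2.0 * h)) * (fp - fm) * e
    return g / batch_size

def zo_batched_descent(f, x0, lr=0.1, iters=200, batch_size=16, seed=0):
    """Plain (non-accelerated) batched zeroth-order descent loop."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - lr * batched_zo_gradient(f, x, batch_size=batch_size, rng=rng)
    return x

if __name__ == "__main__":
    # Strongly convex quadratic test objective with minimizer (-1, ..., -1).
    quad = lambda x: 0.5 * np.dot(x, x) + np.sum(x)
    print(zo_batched_descent(quad, x0=np.zeros(5)))
```

Increasing `batch_size` trades extra oracle calls per iteration for a lower-variance gradient estimate, which is the mechanism through which the paper's analysis lets the tolerable noise level depend on $B$.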
