Extensions to weaker smoothness and heavy-tailed noise

Establish convergence and complexity guarantees for the Stochastic Random Search method, i.e., the comparison-based algorithm that updates x_{t+1} = x_t − η_t · sign(M_t^+ − M_t^−) · s_t, where M_t^+ and M_t^− are stochastic function evaluations at the symmetric perturbations x_t + s_t and x_t − s_t, under smoothness assumptions weaker than (L0, L1)-smoothness and under noise models with heavy-tailed distributions in the function evaluations.
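As a concrete illustration, here is a minimal sketch of this comparison-based update in Python. It assumes the perturbation s_t is a scaled Gaussian direction and uses a constant step size; the function names, step-size schedule, and sampling choices are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np


def stochastic_random_search(noisy_f, x0, steps=1000, eta=1e-2, sigma=1e-1, rng=None):
    """Sketch of the comparison-based update x_{t+1} = x_t - eta * sign(M_t^+ - M_t^-) * s_t.

    Only the sign of the difference between two noisy evaluations at the
    symmetric points x_t + s_t and x_t - s_t is used, not the values themselves.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        s = sigma * rng.standard_normal(x.shape)      # random symmetric perturbation s_t
        m_plus = noisy_f(x + s)                       # stochastic evaluation M_t^+
        m_minus = noisy_f(x - s)                      # stochastic evaluation M_t^-
        x = x - eta * np.sign(m_plus - m_minus) * s   # step away from the apparently larger side
    return x


if __name__ == "__main__":
    # Illustrative use on a noisy quadratic objective (hypothetical test function).
    rng = np.random.default_rng(0)
    noisy_quadratic = lambda z: float(np.sum(z ** 2) + 0.01 * rng.standard_normal())
    print(stochastic_random_search(noisy_quadratic, x0=np.ones(5), rng=rng))
```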

Background

The paper’s analysis of Stochastic Random Search relies on the (L0, L1)-smoothness assumption and bounded-variance noise models to derive rates across stochastic, finite-sum, and helper-feedback settings. These assumptions enable descent bounds via translation invariance and control of the difference-estimator error.
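For reference, one common form of the (L0, L1)-smoothness condition for twice-differentiable f, together with a bounded-variance noise model on the evaluations, can be written as follows; the precise variants used in the paper may differ.

```latex
\|\nabla^2 f(x)\| \;\le\; L_0 + L_1\,\|\nabla f(x)\| \quad \text{for all } x,
\qquad
\mathbb{E}\big[(M_t^{\pm} - f(x_t \pm s_t))^2\big] \;\le\; \sigma^2 .
```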

However, the authors note that extending their theoretical guarantees beyond these regularity conditions, specifically to weaker notions of smoothness and to heavy-tailed noise, has not been addressed. Such extensions would broaden applicability to settings where the objective does not satisfy (L0, L1)-type generalized smoothness or where the evaluation noise does not admit finite variance, both of which arise in modern large-scale and human-in-the-loop optimization.
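For context, heavy-tailed noise is often modeled by replacing the bounded-variance condition with a bounded α-th moment for some α in (1, 2]; this is a standard formulation in the heavy-tailed stochastic optimization literature, not an assumption stated in the paper.

```latex
\mathbb{E}\big[\,|M_t^{\pm} - f(x_t \pm s_t)|^{\alpha}\,\big] \;\le\; \sigma^{\alpha},
\qquad \alpha \in (1, 2].
```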

References

Extensions to weaker smoothness or heavy-tailed noise remain open.

Stochastic Optimization with Random Search (arXiv:2510.15610, Chayti et al., 17 Oct 2025), Section "Limitations, Future Work, and Conclusion", Limitations paragraph.