Extensions to weaker smoothness and heavy-tailed noise
Establish convergence and complexity guarantees for the Stochastic Random Search method—i.e., the comparison-based algorithm that updates x_{t+1} = x_t − η_t · sign(M_t^+ − M_t^−) · s_t using two stochastic function evaluations at symmetric perturbations—under smoothness assumptions weaker than (L0, L1)-smoothness and under heavy-tailed noise in the function evaluations.
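To make the update rule concrete, below is a minimal sketch of one iteration loop of such a comparison-based random search. The noisy oracle f_noisy, the perturbation radius mu, the step-size schedule eta(t), and the unit-Gaussian choice of search direction s_t are illustrative assumptions, not specifics from the paper.

```python
import numpy as np

def stochastic_random_search(f_noisy, x0, eta, mu, num_steps, rng=None):
    """Sketch of the update x_{t+1} = x_t - eta_t * sign(M_t^+ - M_t^-) * s_t.

    f_noisy : callable returning a noisy evaluation of the objective at a point.
    eta     : step-size schedule, callable eta(t) (assumed form).
    mu      : perturbation radius for the symmetric queries x_t +/- mu * s_t (assumed parameter).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    for t in range(num_steps):
        # Random search direction; a normalized Gaussian direction is one common choice (assumption).
        s = rng.standard_normal(x.shape)
        s /= np.linalg.norm(s)
        # Two stochastic evaluations at symmetric perturbations around the current iterate.
        m_plus = f_noisy(x + mu * s)
        m_minus = f_noisy(x - mu * s)
        # Only the comparison (sign) of the two noisy values enters the update.
        x = x - eta(t) * np.sign(m_plus - m_minus) * s
    return x
```

Because only the sign of M_t^+ − M_t^− is used, the magnitude of the noise never enters the step directly, which is why extending the analysis to heavy-tailed evaluation noise is a natural but nontrivial open direction.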
References
Extensions to weaker smoothness or heavy-tailed noise remain open.
— Stochastic Optimization with Random Search
(arXiv:2510.15610, Chayti et al., 17 Oct 2025), Limitations paragraph, Section "Limitations, Future Work, and Conclusion"