Global Convergence and Acceleration for Single Observation Gradient Free Optimization (2509.04424v1)
Abstract: Simultaneous perturbation stochastic approximation (SPSA) is an approach to gradient-free optimization introduced by Spall as a simplification of the approach of Kiefer and Wolfowitz. In many cases the most attractive option is the single-sample version known as 1SPSA, which is the focus of the present paper. The paper contains two major contributions: a modification of the algorithm designed to ensure convergence from an arbitrary initial condition, and a new approach to exploration that dramatically accelerates the rate of convergence. Examples are provided to illustrate the theory and to demonstrate that estimates from unmodified 1SPSA may diverge even for a quadratic objective function.
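To make the object of study concrete, the following is a minimal sketch of the standard one-measurement SPSA (1SPSA) update: each iteration draws a Rademacher perturbation, takes a single noisy function observation at the perturbed point, and forms a gradient estimate from that one value. The gain schedules (`a / k` and `c / k**(1/6)`) and all numeric parameters are illustrative assumptions, not the ones analyzed in the paper, and no convergence is implied; indeed, the paper shows unmodified 1SPSA can diverge even on a quadratic.

```python
import numpy as np

def one_spsa(f, theta0, iters=500, a=0.02, c=0.1, seed=0):
    """One-measurement SPSA (1SPSA): gradient estimated from a single
    function evaluation per iteration. Gain schedules are assumed for
    illustration, not taken from the paper."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    d = theta.size
    for k in range(1, iters + 1):
        a_k = a / k              # step-size gain (assumed schedule)
        c_k = c / k ** (1 / 6)   # perturbation gain (assumed schedule)
        delta = rng.choice([-1.0, 1.0], size=d)  # Rademacher perturbation
        y = f(theta + c_k * delta)               # the single observation
        g_hat = (y / c_k) * (1.0 / delta)        # 1SPSA gradient estimate
        theta = theta - a_k * g_hat
    return theta

# Illustrative run on a noiseless quadratic objective f(x) = ||x||^2
theta_final = one_spsa(lambda x: float(x @ x), 0.5 * np.ones(3))
```

With these small gains and an initial condition near the optimum the iterates stay bounded; the divergence phenomenon the paper addresses arises for less favorable initial conditions and gain choices.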