Incorporating finite-difference approximation error into ZO-RS-SQP analysis

Establish convergence guarantees for the zeroth-order random-subspace sequential quadratic programming method (ZO-RS-SQP) that explicitly account for the finite-difference approximation error introduced by the two-point directional estimators used to construct the projected objective gradient and constraint Jacobians. The aim is to remove the current assumption of access to the exact reduced SQP model induced by the sampled subspace.
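To make the error term concrete, the standard bound for a two-point forward-difference estimator (a textbook fact about smooth functions, not a result from the paper) quantifies the perturbation such an analysis would need to absorb: for an objective $f$ with $L$-Lipschitz gradient, a point $x$, a direction $u$, and a stepsize $h > 0$,

```latex
\left| \frac{f(x + h u) - f(x)}{h} - \nabla f(x)^{\top} u \right|
\;\le\; \frac{L h}{2}\, \|u\|^{2}.
```

Applied along each sampled subspace direction, this yields an $O(h)$ inexactness in the projected gradient and constraint Jacobians that the reduced SQP analysis would have to tolerate.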

Background

The paper analyzes a constrained zeroth-order optimization method that combines random subspace sampling with SQP-style updates. Its convergence analysis, however, studies an exact reduced SQP model: it assumes access to exact projected gradients and constraint Jacobians within the sampled subspace.

However, the practical algorithm relies on two-point finite-difference estimators to approximate these projected quantities using only function evaluations. The authors explicitly defer incorporating the resulting approximation error into the convergence analysis, leaving a gap between the analyzed model and the implemented algorithm that must be closed to fully justify the guarantees for the zeroth-order method.
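As an illustration (not from the paper; the function names and the quadratic test problem below are invented for this sketch), the projected gradient $P^\top \nabla f(x)$ within a sampled subspace can be estimated with two-point forward differences along the columns of the subspace basis $P$:

```python
import numpy as np

def projected_grad_fd(f, x, P, h=1e-5):
    """Two-point forward-difference estimate of P^T grad f(x).

    Uses one evaluation at x plus one per subspace direction, so the
    cost scales with the subspace dimension s, not the ambient dimension n.
    """
    fx = f(x)
    g = np.empty(P.shape[1])
    for j in range(P.shape[1]):
        g[j] = (f(x + h * P[:, j]) - fx) / h
    return g

# Smooth test problem with a known gradient: f(x) = 0.5 ||A x||^2,
# so grad f(x) = A^T A x and the projected gradient is computable exactly.
rng = np.random.default_rng(0)
n, s = 20, 5
A = rng.standard_normal((n, n))
f = lambda x: 0.5 * np.dot(A @ x, A @ x)
x = rng.standard_normal(n)
P = np.linalg.qr(rng.standard_normal((n, s)))[0]  # orthonormal subspace basis

exact = P.T @ (A.T @ A @ x)
approx = projected_grad_fd(f, x, P)
err = np.linalg.norm(approx - exact)  # forward-difference truncation error, O(h)
```

For this quadratic the truncation error shrinks linearly in `h`, matching the $O(h)$ behavior that a fully zeroth-order analysis would need to carry through the reduced SQP model.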

References

To isolate the effect of subspace restriction in the current analysis, Section~\ref{sec:conv-analysis} studies the exact reduced SQP model induced by the sampled subspace; incorporating the finite-difference approximation error of the fully zeroth-order implementation is left for future work.

Random-Subspace Sequential Quadratic Programming for Constrained Zeroth-Order Optimization  (2604.02202 - Zhang et al., 2 Apr 2026) in Introduction, final paragraph (end of Section 1)