
Convergence guarantees to solutions of the original constrained problem

Establish guarantees that the zeroth-order block gradient descent ascent (ZOB-GDA) and zeroth-order block smoothed gradient descent ascent (ZOB-SGDA) algorithms converge to solutions of the original black-box constrained optimization problem min h(x) subject to c_j(x) ≤ 0 for all j (Equation (1)), rather than only to stationary points of its nonconvex–concave Lagrangian min–max reformulation f(x, y) (Equation (2)).
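For concreteness, the two formulations can be sketched as follows. This uses the standard bounded-multiplier Lagrangian; the paper's exact definition of f(x, y) and of the multiplier bound may differ.

```latex
% Original constrained problem (1)
\min_{x \in \mathbb{R}^d} \; h(x)
\quad \text{s.t.} \quad c_j(x) \le 0, \quad j = 1, \dots, m.

% Nonconvex--concave min--max reformulation (2), assuming the
% standard Lagrangian with multipliers bounded by some \bar{y} > 0:
\min_{x \in \mathbb{R}^d} \; \max_{y \in [0, \bar{y}]^m} \;
f(x, y) \;=\; h(x) + \sum_{j=1}^{m} y_j \, c_j(x).
```

Since f is linear (hence concave) in y and h, c_j are nonconvex black boxes, the reformulation is nonconvex–concave, which is why convergence results target stationary points of f rather than solutions of (1).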


Background

The paper proposes two zeroth-order algorithms, ZOB-GDA and ZOB-SGDA, for constrained black-box optimization by reformulating the problem as a nonconvex–concave min–max problem and proving convergence to stationary points of the Lagrangian f(x, y).

While a lemma shows that, under an additional condition (y strictly below the imposed upper bound), a stationary point of f(x, y) yields a critical KKT point of the original constrained problem, the authors explicitly note that full convergence guarantees to solutions of the original problem remain unestablished. Closing this gap would provide guarantees directly in terms of the original constrained optimization objective and constraints.
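To make the setting concrete, below is a minimal sketch of one generic zeroth-order gradient descent ascent step on a bounded-multiplier Lagrangian. This is not the authors' ZOB-GDA or ZOB-SGDA (which use block coordinate updates and, for the latter, smoothing); the gradient estimator, step sizes, and direction count here are illustrative assumptions.

```python
import numpy as np

def zo_grad(fun, x, mu=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate of `fun` at `x`,
    averaged over random Gaussian directions. A generic estimator,
    not the paper's block scheme."""
    rng = np.random.default_rng(rng)
    d = x.size
    n_dirs = 2 * d  # number of probe directions (hypothetical choice)
    g = np.zeros(d)
    for _ in range(n_dirs):
        u = rng.standard_normal(d)
        g += (fun(x + mu * u) - fun(x - mu * u)) / (2 * mu) * u
    return g / n_dirs

def zo_gda_step(h, c, x, y, y_bar, eta_x=0.05, eta_y=0.05, rng=None):
    """One zeroth-order GDA step on f(x, y) = h(x) + y . c(x),
    projecting y onto [0, y_bar]^m. A hedged sketch only."""
    f_x = lambda z: h(z) + y @ c(z)
    x_new = x - eta_x * zo_grad(f_x, x, rng=rng)
    # f is linear in y, so the ascent step can use c(x) directly.
    y_new = np.clip(y + eta_y * c(x), 0.0, y_bar)
    return x_new, y_new
```

On a toy instance such as h(x) = ||x||^2 with the single constraint 1 - x_1 ≤ 0, iterating this step drives (x, y) toward the KKT pair x* = (1, 0), y* = 2, illustrating the lemma's regime where the multiplier stays strictly below the bound y_bar.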

References

Our theoretical results establish convergence guarantees to stationary points of f(x,y) for the proposed algorithms, while the convergence guarantees to the solutions to problem (1) are yet to be established.

Query-Efficient Zeroth-Order Algorithms for Nonconvex Optimization (2510.19165 - Jin et al., 22 Oct 2025) in Section: Discussions (Stationary Points of (2) Can Provide Solutions to (1))