
SLSQP: Sequential Least Squares Programming

Updated 10 April 2026
  • Sequential Least Squares Quadratic Programming (SLSQP) is a derivative-based optimization method that iteratively solves quadratic subproblems to enforce both equality and inequality constraints.
  • It employs quasi-Newton updates, finite-difference approximations, and line search techniques to achieve global convergence even on ill-conditioned problems.
  • Modern variants like PySLSQP and I-SLSQP enhance transparency, enable warm restarts, and improve computational efficiency for medium-scale nonlinear programming tasks.

Sequential Least Squares Quadratic Programming (SLSQP) is a widely adopted algorithm for solving nonlinear programming (NLP) problems subject to both equality and inequality constraints. It iteratively solves a sequence of quadratic or least-squares subproblems, employing quasi-Newton updates, constraint management, and line search techniques to ensure global convergence. Modernizations such as PySLSQP and algorithmic improvements like I-SLSQP have further enhanced its transparency, robustness, and applicability to ill-conditioned and medium-scale optimization problems (Joshy et al., 2024, Ma et al., 2024).
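SLSQP is widely available off the shelf; for instance, SciPy exposes Kraft's original implementation via scipy.optimize.minimize(method='SLSQP'). A minimal sketch on a small inequality-constrained problem (the example problem itself is illustrative, not taken from the cited papers):

```python
import numpy as np
from scipy.optimize import minimize

# Minimize (x0 - 1)^2 + (x1 - 2.5)^2 subject to three linear
# inequality constraints (each must be >= 0) and nonnegativity bounds.
objective = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.5)**2
constraints = [
    {"type": "ineq", "fun": lambda x: x[0] - 2*x[1] + 2},
    {"type": "ineq", "fun": lambda x: -x[0] - 2*x[1] + 6},
    {"type": "ineq", "fun": lambda x: -x[0] + 2*x[1] + 2},
]
res = minimize(objective, x0=[2.0, 0.0], method="SLSQP",
               bounds=[(0, None), (0, None)], constraints=constraints)
print(res.x)  # converges to approximately [1.4, 1.7]
```

SciPy approximates gradients by finite differences here; supplying analytic `jac` callables typically reduces the evaluation count substantially.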

1. Mathematical Structure and Algorithmic Core

At a major iteration $k$, SLSQP generates a QP subproblem using the current iterate $x_k$, Lagrangian gradient $g_k$, and a symmetric positive-definite Hessian approximation $B_k$:

$$\begin{array}{rl} \min_{p\in\mathbb{R}^n} & \frac{1}{2}p^T B_k p + g_k^T p \\ \text{subject to} & A_{eq,k}\,p + c_{eq}(x_k) = 0 \\ & A_{ineq,k}\,p + c_{ineq}(x_k) \ge 0 \end{array}$$

where $A_{eq,k}$ and $A_{ineq,k}$ are the Jacobians of the equality and inequality constraints, respectively.

The corresponding Karush-Kuhn-Tucker (KKT) optimality system enforces stationarity, primal feasibility, dual feasibility, and complementary slackness. If second derivatives are unavailable, SLSQP maintains $B_k \succ 0$ through a BFGS update:

$$B_{k+1} = B_k + \frac{y_k y_k^T}{y_k^T s_k} - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k}$$

for $s_k = x_{k+1} - x_k$ and $y_k$ the corresponding change in the Lagrangian gradient, with suitable damping of $y_k$ when the curvature $y_k^T s_k$ is not sufficiently positive (Joshy et al., 2024).
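The damped update above can be sketched in dense form using Powell-style damping (illustrative only; SLSQP itself maintains a factored representation of $B_k$ rather than the dense matrix):

```python
import numpy as np

def damped_bfgs_update(B, s, y, theta_min=0.2):
    """Powell-damped BFGS update that keeps B symmetric positive definite.
    When curvature s'y is too small relative to s'Bs, y is blended
    toward Bs so the update remains well defined."""
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy < theta_min * sBs:
        theta = (1.0 - theta_min) * sBs / (sBs - sy)
        y = theta * y + (1.0 - theta) * Bs
        sy = s @ y
    return B + np.outer(y, y) / sy - np.outer(Bs, Bs) / sBs
```

When damping is inactive, the updated matrix satisfies the secant condition $B_{k+1} s_k = y_k$, which is easy to verify numerically.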

2. Subproblem Solution and Enhanced Variants

Standard SLSQP forms the QP or, equivalently, a linearly constrained least-squares (LSQ) subproblem $\min_p \|L_k^T p + L_k^{-1} g_k\|$ subject to the same linearized constraints, where $B_k = L_k L_k^T$ is a Cholesky-type factorization of the Hessian approximation. When the LSQ subproblem is inconsistent, improved variants like I-SLSQP apply hybrid relaxations:

  • RLSQ1: Modified Powell relaxation introducing a single scalar slack variable that uniformly relaxes the inconsistent linearized constraints.
  • RLSQ2: Nowak-type relaxation using slack vectors with strong penalties.

I-SLSQP dynamically switches between these relaxations depending on feasibility and conditioning, returning the first viable step direction (Ma et al., 2024).

In traditional implementations, dual LSQ solvers leverage nonnegative least squares (NNLS). However, in the presence of tiny denominators arising in the dual solver's update formulas, numerical cancellation can cause catastrophic search-direction failures. When such pathologies are detected (e.g., ascent directions or abnormally large step norms), I-SLSQP falls back to a protected QP solver for the LSQ subproblem (Ma et al., 2024).
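The fallback logic described above can be sketched as follows (the function names, thresholds, and interface here are hypothetical illustrations, not taken from the I-SLSQP code):

```python
import numpy as np

def accept_or_fallback(p, g, x_norm, fallback_qp, max_step_factor=1e3):
    """Safeguard sketch: reject a dual-LSQ step direction p that is an
    ascent direction for gradient g, or that is abnormally long relative
    to the current iterate, and recompute it with a protected QP solver."""
    ascending = g @ p > 0.0                                  # not descent
    too_long = np.linalg.norm(p) > max_step_factor * max(1.0, x_norm)
    if ascending or too_long:
        return fallback_qp()                                 # protected QP solve
    return p
```

In a full implementation the fallback solver would re-solve the same LSQ subproblem with a more numerically robust QP method rather than return a precomputed step.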

3. Globalization: Scaling, Derivative Estimation, and Merit Functions

Numerical stability is enhanced in PySLSQP by diagonal scaling of the variables, objective, and constraints, using user-supplied scaling factors for each. Gradients and steps are automatically scaled and unscaled during processing, improving performance on poorly conditioned problems (Joshy et al., 2024).
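As an illustration of the chain-rule bookkeeping such scaling requires, assuming simple multiplicative scalers (a sketch, not PySLSQP's internal code):

```python
import numpy as np

def make_scaled_problem(obj, grad, x_scaler, obj_scaler):
    """Wrap an objective and gradient so the solver sees scaled
    quantities x_s = x_scaler * x and f_s = obj_scaler * f.
    By the chain rule, df_s/dx_s = obj_scaler * (df/dx) / x_scaler."""
    x_scaler = np.asarray(x_scaler, dtype=float)
    def obj_s(xs):
        return obj_scaler * obj(xs / x_scaler)
    def grad_s(xs):
        return obj_scaler * grad(xs / x_scaler) / x_scaler
    return obj_s, grad_s
```

Constraint scaling works the same way, with one multiplicative factor per constraint row of the Jacobian.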

When analytic derivatives are unavailable, forward finite differences approximate gradients. PySLSQP supplies finite_diff_abs_step and finite_diff_rel_step options, monitoring function variation to adapt the step size and mitigate cancellation, reducing it as necessary down to machine precision.
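A minimal forward-difference gradient in the spirit of these options (the adaptive monitoring itself is omitted; the step-selection rule below is a common simplified choice, not PySLSQP's exact logic):

```python
import numpy as np

def fd_gradient(f, x, abs_step=None, rel_step=None):
    """Forward finite-difference gradient with absolute or relative steps.
    Default relative step sqrt(machine eps) balances truncation error
    against floating-point cancellation for smooth f."""
    x = np.asarray(x, dtype=float)
    if abs_step is not None:
        h = np.full_like(x, abs_step)
    else:
        rel = np.sqrt(np.finfo(float).eps) if rel_step is None else rel_step
        h = rel * np.maximum(1.0, np.abs(x))
    f0 = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h[i]
        g[i] = (f(xp) - f0) / h[i]
    return g
```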

For globalization, SLSQP employs a backtracking line search on an $\ell_1$ merit function $\phi(x;\rho) = f(x) + \sum_i \rho_i\,|c_{eq,i}(x)| + \sum_j \rho_j \max(0, -c_{ineq,j}(x))$ with adaptively updated penalty weights $\rho$, enforcing sufficient decrease (Joshy et al., 2024).
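A sketch of such a merit-function backtracking search (penalty handling simplified to a single scalar weight; `dphi`, the merit directional derivative along the step, is assumed to be supplied by the caller):

```python
import numpy as np

def l1_merit(f, c_eq, c_ineq, x, rho):
    """l1 merit: objective plus penalized constraint violation."""
    viol = np.abs(c_eq(x)).sum() + np.maximum(0.0, -c_ineq(x)).sum()
    return f(x) + rho * viol

def backtrack(f, c_eq, c_ineq, x, p, rho, dphi,
              alpha=1.0, beta=0.5, c1=1e-4):
    """Backtracking line search enforcing an Armijo-style sufficient
    decrease of the merit function along the SQP step p (dphi < 0)."""
    phi0 = l1_merit(f, c_eq, c_ineq, x, rho)
    while l1_merit(f, c_eq, c_ineq, x + alpha * p, rho) > phi0 + c1 * alpha * dphi:
        alpha *= beta
    return alpha
```

On an unconstrained quadratic the full step is accepted immediately, while an overlong step is halved until the decrease condition holds.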

4. Implementation: PySLSQP Modernization and API

PySLSQP wraps the original SLSQP Fortran kernel from Kraft, using Meson and f2py to produce a compiled _slsqp.so extension. The Python front end (pyslsqp.optimize) manages argument parsing, scaling, finite-difference evaluation, calls into the Fortran driver, and handles data saving and visualization.

The API accepts wide-ranging user options, including warm/hot restarts, variable and constraint scaling, finite-difference tunings, and selection of which algorithmic internals to save.
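A hypothetical sketch of how such options might be assembled (apart from the finite-difference steps named above, the option names below are illustrative assumptions; consult the PySLSQP documentation for the exact API):

```python
# Sketch only: collect solver options before calling the PySLSQP front end.
options = {
    "finite_diff_abs_step": 1e-6,   # documented finite-difference controls
    "finite_diff_rel_step": None,
    # The names below are illustrative placeholders for the kinds of
    # options described in the text (restarts, scaling, state saving).
    "hot_start": False,
    "save_itr": "major",
}

# The actual call would resemble (commented out; signature is assumed):
# from pyslsqp import optimize
# results = optimize(x0, obj=obj, grad=grad, con=con, jac=jac, **options)
```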

Critical features of PySLSQP include:

  • Access and live storage of internal optimizer state (iterates, multipliers, BFGS diagonals, etc.).
  • Seamless warm and hot restarts for rapid re-optimization.
  • Live plotting and post-processing via Matplotlib using HDF5 files, with routines for loading and visualizing optimization histories.
  • Diagnostic log files and rich tools for integrating into research workflows (Joshy et al., 2024).

5. Robustness, Convergence, and Comparative Performance

I-SLSQP and PySLSQP introduce robust handling of ill-conditioned subproblems via their hybrid LSQ/QP solution and dual-LSQ failure safeguards. I-SLSQP in particular only resets the Hessian update when an ascent direction is encountered, avoiding premature loss of curvature information. Its two-group convergence testing (feasibility and optimality) ensures algorithmic termination iff stationarity or step tolerance is met.
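The two-group convergence test described above can be sketched as follows (names and tolerance values are hypothetical):

```python
def converged(kkt_res, feas, step_norm,
              tol_opt=1e-6, tol_feas=1e-6, tol_step=1e-10):
    """Two-group stopping test: terminate when the iterate is both
    stationary (small KKT residual) and feasible, or when the step
    has stalled below the step tolerance."""
    optimal = kkt_res <= tol_opt and feas <= tol_feas
    stalled = step_norm <= tol_step
    return optimal or stalled
```

Grouping optimality and feasibility separately from the step test prevents a feasible-but-nonstationary point (or vice versa) from triggering premature termination.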

In extensive computational experiments across 42 large-scale nonlinear process engineering instances, I-SLSQP was the only tested method to succeed in all cases, reliably overcoming infeasibility declarations that halted fmincon (MATLAB SQP) and IPOPT. PySLSQP—corresponding to the original Kraft/Schittkowski SLSQP—solved nearly all instances but prematurely terminated on two. I-SQP (improved QP-based SQP) performed efficiently on well-conditioned classes but was less robust on ill-conditioned problems.

Benchmarks for medium-scale problems (≈200 variables/constraints) demonstrate that PySLSQP converges within 200 function evaluations on optimal-control tasks where SNOPT, IPOPT, and SciPy’s Trust-Constr failed or returned infeasible solutions under identical evaluation budgets (Joshy et al., 2024, Ma et al., 2024).

6. Practical Considerations, Limitations, and Outlook

PySLSQP provides a transparent, research-oriented interface for SLSQP, exposing all internal quantities, enabling reproducibility, and allowing customization through warm/hot restarts and tuning options. Its flexible scaling and robust finite-difference handlers address longstanding challenges of numerical instability and lack of transparency in black-box optimizers.

For well-conditioned or small to medium NLPs, I-SQP may deliver superior walltime performance due to lower overhead. For ill-conditioned and constraint-dominated problems, I-SLSQP and PySLSQP provide more reliable convergence and better minima, with only modest additional computational cost.

A plausible implication is that SLSQP, coupled with modern transparency and robustification strategies, remains state-of-the-art for medium-scale NLPs requiring constraint feasibility, flexible post-processing, and on-the-fly monitoring. The separation of algorithmic core (Fortran or compiled extension) and flexible Python front end provides an extensible base for future research in large-scale nonlinear optimization (Joshy et al., 2024, Ma et al., 2024).


References

  • "PySLSQP: A transparent Python package for the SLSQP optimization algorithm modernized with utilities for visualization and post-processing" (Joshy et al., 2024)
  • "Improved SQP and SLSQP Algorithms for Feasible Path-based Process Optimisation" (Ma et al., 2024)
