Directional limiting subdifferentials are refinements of classical subdifferentials that capture first-order sensitivity along specified directions.
They yield sharper optimality conditions and require weaker qualification conditions for calculus rules, thus enhancing sensitivity estimates in variational analysis.
Their calculus extends to chain, sum, maximum, and asymptotic (at-infinity) settings, aiding practical optimization and the estimation of value functions.
A directional limiting subdifferential is a central object in variational analysis, refining the classical Mordukhovich subdifferential to describe the first-order sensitivity of nonsmooth functions and value mappings along prescribed directions. By constraining the limiting process to sequences approaching a reference point or infinity along a designated direction, these constructions yield sharper optimality conditions, weaker qualification requirements for calculus rules, and more localized sensitivity estimates. Directional limiting subdifferentials are fundamental in characterizing directional Lipschitz properties, deriving upper estimates for value functions in parametric optimization, and extending classical results such as Danskin’s and Gauvin–Dubeau’s theorems.
1. Formal Definition and Notational Framework
Let $f:\mathbb{R}^n\to\overline{\mathbb{R}}$ be lower semicontinuous at $\bar x\in\operatorname{dom}f$ and $u\in\mathbb{R}^n$ a fixed direction. The directional limiting subdifferential $\partial f(\bar x;u)$ and singular subdifferential $\partial^\infty f(\bar x;u)$ are defined as
$$\partial f(\bar x;u):=\{\xi\in\mathbb{R}^n \mid \exists\, x_k\xrightarrow{u}\bar x,\ \xi_k\to\xi,\ f(x_k)\to f(\bar x),\ \xi_k\in\widehat\partial f(x_k)\},$$
$$\partial^\infty f(\bar x;u):=\{\xi\in\mathbb{R}^n \mid \exists\, x_k\xrightarrow{u}\bar x,\ \tau_k\downarrow 0,\ \xi_k\in\widehat\partial f(x_k),\ f(x_k)\to f(\bar x),\ \tau_k\xi_k\to\xi\},$$
where $\widehat\partial f(x)$ is the Fréchet (regular) subdifferential, and $x_k\xrightarrow{u}\bar x$ requires $x_k=\bar x+t_ku_k$ with $t_k\downarrow 0$, $u_k\to u$ (Bai et al., 2022). When $u=0$, these reduce to the Mordukhovich (limiting) and horizon subdifferentials.
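A quick illustration (ours, not drawn from the cited references): for $f(x)=\sqrt{|x|}$ at $\bar x=0$ with $u=1$, one has $\widehat\partial f(x)=\{1/(2\sqrt{x})\}$ for $x>0$, so the Fréchet subgradients blow up along any sequence $x_k\downarrow 0$. Consequently $\partial f(0;1)=\emptyset$ while $\partial^\infty f(0;1)=[0,+\infty)$: the singular directional subdifferential detects the non-Lipschitz behavior of $f$ at $0$ in direction $1$.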
Alternatively, via normal cones to the epigraph, for an extended direction $(u,\mu)\in\mathbb{R}^n\times\mathbb{R}$,
$$\partial f(\bar x;(u,\mu)) := \{\xi\in\mathbb{R}^n \mid (\xi,-1)\in N_{\operatorname{epi}f}((\bar x,f(\bar x));(u,\mu))\},$$
where the directional limiting normal cone $N_{\operatorname{epi}f}((\bar x,f(\bar x));(u,\mu))$ collects limits of proximal normals along sequences tangent to $(u,\mu)$ (Benko et al., 2017). This refines the allocation of subgradients to directions of approach and agrees with the standard limiting subdifferential for $(u,\mu)=(0,0)$.
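As a direct check of the epigraph formulation (our computation): for $f(x)=|x|$ on $\mathbb{R}$, $\operatorname{epi}f=\{(x,\alpha)\mid\alpha\ge|x|\}$, and near the direction $(1,1)$ the epigraph boundary is the ray $\alpha=x$, whose proximal normals are $\lambda(1,-1)$, $\lambda\ge 0$. Hence $N_{\operatorname{epi}f}((0,0);(1,1))=\{\lambda(1,-1)\mid\lambda\ge 0\}$, and the pairs $(\xi,-1)$ in this cone give $\partial f(0;(1,1))=\{1\}$, matching the sequential definition above.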
2. Calculus Rules: Chain, Sum, and Maxima
Directional limiting subdifferentials admit a calculus that parallels, but sharpens, the classical theory. Under weak (directional) metric subregularity conditions:
Chain Rule: For $f=g\circ F$, with $F:\mathbb{R}^n\to\mathbb{R}^m$ continuous, $g:\mathbb{R}^m\to\overline{\mathbb{R}}$ l.s.c., and suitable subregularity,
$$\partial (g\circ F)(\bar x;u)\subseteq \bigcup_{v\in\mathbb{R}^m} D^*F\bigl((\bar x,F(\bar x));(u,v)\bigr)\bigl(\partial g(F(\bar x);v)\bigr),$$
where $D^*F$ is the (directional limiting) coderivative, and the union restricts $v$ to admissible directions of $F$ at $\bar x$ along $u$ (Benko et al., 2017).
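When $F$ is continuously differentiable, the coderivative reduces to the adjoint Jacobian, $D^*F(\bar x)(y^*)=\{\nabla F(\bar x)^{\mathsf T}y^*\}$, and the only admissible direction is $v=\nabla F(\bar x)u$, so the rule reads $\partial(g\circ F)(\bar x;u)\subseteq \nabla F(\bar x)^{\mathsf T}\partial g(F(\bar x);\nabla F(\bar x)u)$. As a worked check (ours): for $f(x)=|x_1-x_2|$ at $\bar x=0$ with $u=(1,0)$, here $\nabla F(\bar x)u=1$ and $\partial|\cdot|(0;1)=\{1\}$, so the rule yields $\partial f(0;u)\subseteq\{(1,-1)\}$, which is exact since along direction $(1,0)$ one approaches $\bar x$ off the kink, where $f$ is smooth with gradient $(1,-1)$.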
Sum Rule: For $f=f_1+f_2$, with $f_1,f_2$ l.s.c. and at most one of them non-calm in direction $u$,
$$\partial (f_1+f_2)(\bar x;u)\subseteq \partial f_1(\bar x;u)+\partial f_2(\bar x;u).$$
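A one-dimensional check (ours) shows how the directional rule sharpens the classical one: for $f_1(x)=|x|$, $f_2(x)=-|x|$ at $\bar x=0$ with $u=1$, one has $\partial f_1(0;1)=\{1\}$ and $\partial f_2(0;1)=\{-1\}$, so the directional sum rule gives $\partial(f_1+f_2)(0;1)\subseteq\{1\}+\{-1\}=\{0\}$, which is exact; the non-directional limiting sum rule only yields the much larger set $\partial f_1(0)+\partial f_2(0)=[-1,1]+\{-1,1\}=[-2,2]$.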
Product Rule: If $f_1,f_2$ are directionally differentiable and calm in direction $u$,
$$\partial (f_1 f_2)(\bar x;u)\subseteq \partial\bigl(f_2(\bar x)f_1\bigr)(\bar x;u)+\partial\bigl(f_1(\bar x)f_2\bigr)(\bar x;u).$$
Maximum Rule: For $f=\max_{1\le i\le p}f_i$,
$$\partial f(\bar x;u)\subseteq \Bigl\{\textstyle\sum_{i\in I(\bar x;u)}\lambda_i\xi_i \;\Big|\; \xi_i\in\partial f_i(\bar x;u),\ \lambda_i\ge 0,\ \textstyle\sum_{i\in I(\bar x;u)}\lambda_i=1\Bigr\},$$
where $I(\bar x;u):=\{i \mid f_i(\bar x)=f(\bar x),\ f_i'(\bar x;u)=f'(\bar x;u)\}$ is the active set in direction $u$ (Benko et al., 2017).
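To see the maximum rule in action (our computation): for $f=\max\{f_1,f_2\}$ with $f_1(x)=x$, $f_2(x)=-x$, i.e. $f=|\cdot|$, at $\bar x=0$ with $u=1$, only $f_1$ is active in direction $1$, since $f_1'(0;1)=1=f'(0;1)$ while $f_2'(0;1)=-1$. Hence $I(0;1)=\{1\}$ and the rule gives $\partial f(0;1)\subseteq\{\nabla f_1(0)\}=\{1\}$, strictly sharper than the full limiting subdifferential $\partial f(0)=[-1,1]$.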
When $f$ is directionally Lipschitz at $\bar x$ in direction $u$, all directional subdifferentials are nonempty, and these rules fully enable the application of variational analysis techniques in a directionally localized regime (Qin et al., 2023).
3. Directional Subdifferentials for Value Functions in Optimization
For a parametric program
$$v(p):=\inf_x\{f(x,p) \mid g_i(x,p)\le 0,\ i=1,\dots,m,\ h_j(x,p)=0,\ j=1,\dots,r\},$$
the directional limiting subdifferential $\partial v(\bar p;q)$ captures first-order sensitivity to perturbations of $\bar p$ along a direction $q$ (Bai et al., 2022). Under directional inf-compactness, metric subregularity, and differentiability/geometric derivability hypotheses, the main upper-estimate result is
$$\partial v(\bar p;q)\subseteq \bigcup_{x\in S(\bar p;q)}\bigl\{\nabla_p L(x,\bar p,\lambda) \mid \lambda\in\Lambda(x,\bar p;q)\bigr\},$$
where $L$ denotes the Lagrangian, $S(\bar p;q)$ the directionally relevant solution set, and $\Lambda(x,\bar p;q)$ are sets of generalized Lagrange multipliers over the directional critical and linearization cones. The analogous result holds for $\partial^\infty v(\bar p;q)$ (Bai et al., 2022).
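A minimal parametric example (ours, in the spirit of Danskin's theorem): $v(p)=\inf_{x\in[-1,1]}px=-|p|$. At $\bar p=0$ in direction $q=1$, the solutions selected along $q$ are $x=-1$, and $v(p)=-p$ for $p>0$, so $\partial v(0;1)=\{-1\}$; the upper estimate is attained with $\nabla_p L(x,\bar p,\lambda)=x=-1$ over the directionally relevant solution set.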
Specializations include:
When all data are $C^1$, the formulas collapse to directional gradients;
Additive perturbations and parameter-independent constraints yield simplified multiplier sets;
Pure equality/inequality constraints recover classical KKT-type multipliers directed along $q$.
This framework generalizes and recovers Danskin's theorem and Gauvin–Dubeau-type sensitivity formulas in fully nonsmooth, constrained, and directionally localized settings.
4. Directional Limiting Subdifferential at Infinity
Directional limiting subdifferentials at infinity extend the theory to asymptotic analysis of unbounded functions and sets (Kien et al., 10 Oct 2025). For $f:\mathbb{R}^n\to\overline{\mathbb{R}}$ l.s.c., with $u$ a recession direction of $\operatorname{dom}f$,
$$\partial f(\infty;u):=\{\xi\in\mathbb{R}^n \mid \exists\, x_k,\ \|x_k\|\to\infty,\ x_k/\|x_k\|\to u,\ \xi_k\to\xi,\ \xi_k\in\widehat\partial f(x_k)\},$$
with the singular subdifferential $\partial^\infty f(\infty;u)$ similarly defined with $\tau_k\downarrow 0$ and $\tau_k\xi_k\to\xi$.
Main calculus rules at infinity include:
Sum rule: $\partial(f_1+f_2)(\infty;u)\subseteq \partial f_1(\infty;u)+\partial f_2(\infty;u)$ under a singular-subdifferential qualification condition.
Max rule: Convexifying the directional limits of the component functions yields the corresponding rule at infinity.
Illustrative examples clarify that classical stationarity and error bounds may be recovered or vacuously satisfied at infinity in certain degenerate cases (Kien et al., 10 Oct 2025).
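A simple instance of vacuous stationarity at infinity (our example): for $f(x)=e^{-x}$ on $\mathbb{R}$ with $u=1$, the gradients $\nabla f(x_k)=-e^{-x_k}\to 0$ along any $x_k\to+\infty$, so $0\in\partial f(\infty;1)$ even though $\inf f=0$ is not attained; the stationarity condition at infinity records the asymptotically flat descent direction.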
5. Characterizations, Optimality, and Lipschitz Criteria
A locally l.s.c. function $f$ is directionally Lipschitz at $\bar x$ in direction $u$ if and only if $\partial^\infty f(\bar x;u)=\{0\}$. For parametric value functions $v(p)$, a sufficient condition for directional Lipschitz continuity is the vanishing of all directional singular subgradients in the upper estimate set.
As for optimality, if $\bar x$ is a directional local minimizer of $f$ along $u$, then a directional Fermat-type stationarity condition holds in terms of $\partial f(\bar x;u)$ and $\partial^\infty f(\bar x;u)$; conversely, a strict form of this directional stationarity together with directional Lipschitzness in $u$ implies directional local minimality (Qin et al., 2023).
6. Algorithmic Construction and Numerical Aspects
For $f$ polyhedral, convex, or expressed as a finite maximum of $C^1$ functions, the vertices of the directional limiting subdifferential (the support polytope) can be reconstructed from finitely many directional derivatives. The number of required directions is sharply bounded:
In $\mathbb{R}^1$: 1 or 2;
In $\mathbb{R}^2$: finitely many, bounded in terms of the number $m$ of vertices;
In $\mathbb{R}^2$, the "compass difference" formula constructs a valid Clarke subgradient using four compass-directional derivatives; centered finite differences converge to an element of the generalized gradient for bivariate nonsmooth functions (Khan et al., 2020).
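The compass difference is straightforward to prototype numerically. Below is a minimal sketch (our illustration, not code from Khan et al., 2020): the helper names `directional_derivative`, `compass_difference`, and the step size `h` are ours; it estimates the four compass directional derivatives by one-sided finite differences and assembles the compass difference, which for locally Lipschitz, directionally differentiable bivariate functions approximates a Clarke subgradient.

```python
import numpy as np

def directional_derivative(f, x, d, h=1e-6):
    """One-sided finite-difference estimate of f'(x; d)."""
    return (f(x + h * d) - f(x)) / h

def compass_difference(f, x, h=1e-6):
    """Compass difference of a bivariate function f at x.

    Averages opposite compass directional derivatives; for locally
    Lipschitz, directionally differentiable f: R^2 -> R this vector
    lies in the Clarke generalized gradient (Khan et al., 2020).
    """
    e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    g1 = 0.5 * (directional_derivative(f, x, e1, h)
                - directional_derivative(f, x, -e1, h))
    g2 = 0.5 * (directional_derivative(f, x, e2, h)
                - directional_derivative(f, x, -e2, h))
    return np.array([g1, g2])

# Example: f(x, y) = max(x, y) is nonsmooth on the diagonal; at the
# origin the compass difference is about (0.5, 0.5), an element of
# conv{(1, 0), (0, 1)}, the Clarke generalized gradient of f there.
f = lambda z: max(z[0], z[1])
print(compass_difference(f, np.array([0.0, 0.0])))
```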
7. Illustrative Examples and Special Cases
For $f(x)=-|x|$ at $\bar x=0$, $\partial f(0;1)=\{-1\}$ and $\partial f(0;-1)=\{1\}$, while the global Mordukhovich subdifferential is $\partial f(0)=\{-1,1\}$ (Qin et al., 2023).
For a simple LP value mapping $v(p)$, the value function is smooth and $\partial v(\bar p;q)=\{\nabla v(\bar p)\}$ for any direction $q$ (Bai et al., 2022).
The support function of a compact convex set in $\mathbb{R}^2$ admits a midpoint subgradient constructed from compass differences, a result that fails in higher dimensions (Khan et al., 2020).
Directional limiting subdifferentials thus provide a refined and computationally tractable tool for variational analysis, nonsmooth optimization, and sensitivity in both finite and asymptotic (infinite) regimes.