Generalizing the dual-space strong convexity inequality to nonsmooth losses

Determine whether the dual-space strong convexity inequality for L-smooth convex losses ℓt (namely, that for all x, y ∈ ℝ^n one has ℓt(x) − ℓt(y) − ⟨∇ℓt(y), x − y⟩ ≥ c‖∇ℓt(x) − ∇ℓt(y)‖^2 for some L-dependent constant c) admits a valid extension to nonsmooth convex loss functions. If it does, specify the exact conditions under which the generalized inequality holds and its precise formulation.

Background

The paper’s analysis of the proposed G* regret relies on a strengthened lower bound that follows from L-smoothness on ℝ^n: the deviation from convexity ℓt(x) − ℓt(y) − ⟨∇ℓt(y), x − y⟩ can be lower bounded by a constant multiple of the squared gradient difference ‖∇ℓt(x) − ∇ℓt(y)‖^2. This inequality, presented as (3) in the paper, is a dual-space strong convexity relation and is central to deriving the new regret bounds.
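For reference, the standard smooth-case version of this bound fixes the constant explicitly: for a convex, L-smooth function, co-coercivity of the gradient yields c = 1/(2L). (This is the textbook constant; the paper's exact constant in (3) may be stated differently, so treat the 1/(2L) below as the standard form rather than the paper's.)

```latex
% For a convex, L-smooth loss \ell_t on \mathbb{R}^n, for all x, y \in \mathbb{R}^n:
\ell_t(x) - \ell_t(y) - \langle \nabla \ell_t(y),\, x - y \rangle
  \;\ge\; \frac{1}{2L}\,\bigl\lVert \nabla \ell_t(x) - \nabla \ell_t(y) \bigr\rVert^2 .
```

Equality is attained for quadratics such as ℓt(x) = (L/2)‖x‖^2, so the constant 1/(2L) cannot be improved in general.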

While small-loss (L*) regret bounds can be obtained under the weaker self-boundedness condition, which covers certain nonsmooth losses, the authors note uncertainty about whether the stronger inequality (3) extends beyond smoothness. Establishing an analogue for nonsmooth convex losses would clarify the extent to which the G* regret analysis can be generalized beyond the smooth regime.
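To see why smoothness matters, the following sketch (not from the paper; an independent illustration) numerically checks the inequality with c = 1/(2L) for a smooth quadratic, and shows that for the nonsmooth loss ℓ(x) = |x| with subgradients in place of gradients, no uniform constant c > 0 can work: the convexity gap vanishes as x → 0⁺ while the subgradient difference stays bounded away from zero.

```python
import numpy as np

def gap(f, g, x, y):
    """Deviation from convexity: f(x) - f(y) - <g(y), x - y> (scalar case)."""
    return f(x) - f(y) - g(y) * (x - y)

# Smooth case: f(x) = (L/2) x^2 is L-smooth with gradient L*x.
L = 3.0
f_smooth = lambda x: 0.5 * L * x**2
g_smooth = lambda x: L * x

rng = np.random.default_rng(0)
smooth_ok = all(
    gap(f_smooth, g_smooth, x, y)
    >= (1.0 / (2 * L)) * (g_smooth(x) - g_smooth(y))**2 - 1e-12
    for x, y in rng.normal(size=(100, 2))
)
print("smooth case holds with c = 1/(2L):", smooth_ok)

# Nonsmooth case: f(x) = |x| with subgradient sign(x).
f_abs = lambda x: abs(x)
g_abs = lambda x: 1.0 if x >= 0 else -1.0

# With y = -1 and x -> 0+: gap = 2x -> 0 while (g(x) - g(y))^2 = 4,
# so gap / ||g(x) - g(y)||^2 = x/2 -> 0 and no uniform c > 0 exists.
ratios = [
    gap(f_abs, g_abs, x, -1.0) / (g_abs(x) - g_abs(-1.0))**2
    for x in (1e-1, 1e-3, 1e-6)
]
print("nonsmooth ratios (should shrink toward 0):", ratios)
```

This suggests any nonsmooth extension of (3) must either restrict the class of losses (e.g. via self-boundedness, as the paper does for L* bounds) or replace the squared subgradient difference with a weaker measure.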

References

On the other hand, it is unclear whether (3) can be generalized to handle nonsmooth losses. We leave further investigations to future work.

Small Gradient Norm Regret for Online Convex Optimization  (2601.13519 - Gao et al., 20 Jan 2026) in Remark 5, Section 2.1