Establish localized sup-norm risk bounds for broader estimator classes

Establish that standard local nonparametric estimators (such as kernel and local polynomial estimators) and series least squares estimators (such as spline and wavelet series) achieve the localized sup-norm risk bound E sup_{x' ∈ [0,1]^d ∩ B_p(x,r)} [f(x') − f*(x')]^2 ≲ r^{2β} + n^{-2β/(2β+d)} for any x ∈ [0,1]^d, any attack radius r = O(1), and any regression function f* in the Hölder class H(β, L), under sub-Gaussian errors, provided their regularization parameters are chosen appropriately.

Background

The paper’s minimax upper bounds for adversarial risk rely on a base estimator that satisfies a localized sup-norm risk condition: E sup_{x' ∈ X ∩ B_p(x,r)} [f(x') − f*(x')]^2 ≲ r^{2β} + n^{-2β/(2β+d)}. The authors show this condition can be met by a specific piecewise local polynomial estimator under sub-Gaussian errors.

They explicitly conjecture that widely used estimators—other local nonparametric methods and series least squares estimators—should also satisfy the same localized sup-norm risk bound when their tuning or regularization parameters are properly selected. Proving this would broaden the applicability of their minimax robustness results beyond the particular construction they use.
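As a rough numerical illustration of the conjectured condition (not part of the paper), the sketch below checks the localized sup-norm error of a simple kernel estimator, a Nadaraya-Watson estimator with a box kernel, in the case d = 1, β = 1 (Lipschitz f*), with the rate-optimal bandwidth h = n^{-1/(2β+d)}. The target function, evaluation point x0, radius r, and noise level are all illustrative choices; the only claim being probed is that the localized sup-norm squared error shrinks as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_star(x):
    # A Lipschitz (Hölder beta = 1) target on [0, 1], chosen for illustration.
    return np.abs(x - 0.5)

def nw_estimate(x_train, y_train, x_eval, h):
    # Nadaraya-Watson estimator with a box kernel of bandwidth h:
    # average the responses whose covariates fall within h of each
    # evaluation point.
    w = (np.abs(x_eval[:, None] - x_train[None, :]) <= h).astype(float)
    counts = w.sum(axis=1)
    # Fall back to the global mean on (rare) empty windows.
    return np.where(counts > 0,
                    w @ y_train / np.maximum(counts, 1),
                    y_train.mean())

def localized_sup_sq_error(n, x0=0.3, r=0.1, beta=1.0, d=1, n_rep=20):
    # Monte Carlo estimate of E sup_{x' in B(x0, r)} [fhat(x') - f*(x')]^2
    # with the rate-optimal bandwidth h = n^{-1/(2*beta + d)}.
    h = n ** (-1.0 / (2 * beta + d))
    grid = np.linspace(max(0.0, x0 - r), min(1.0, x0 + r), 200)
    errs = []
    for _ in range(n_rep):
        x = rng.uniform(0.0, 1.0, n)
        y = f_star(x) + 0.1 * rng.standard_normal(n)  # sub-Gaussian noise
        fhat = nw_estimate(x, y, grid, h)
        errs.append(np.max((fhat - f_star(grid)) ** 2))
    return float(np.mean(errs))

risk_small = localized_sup_sq_error(200)
risk_large = localized_sup_sq_error(5000)
```

With r fixed and small, the r^{2β} term is a constant offset, so the visible effect of increasing n is the decay of the n^{-2β/(2β+d)} estimation-error term; the localized sup-norm error at n = 5000 should be well below that at n = 200.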

References

It is also conjectured that other local nonparametric estimators \citep[see, e.g.,][]{Stone1982, BERTIN2004225, GAIFFAS2007782, Tsybakov2009} and series least squares estimators \citep[see, e.g.,][]{CHEN2015447, BELLONI2015345} may achieve the bound in eq:local_risk provided their regularization parameters are chosen properly.

eq:local_risk:

\mathbb{E}\sup_{x' \in \mathcal{X} \cap B_p(x,r)} \big[ {f}(x') - f^*(x') \big]^2 \lesssim r^{2\beta} + n^{-\frac{2\beta}{2\beta+d}}

On damage of interpolation to adversarial robustness in regression  (2601.16070 - Peng et al., 22 Jan 2026) in Section 3.1 (Minimax rates), Step 1