
Existence of a universal lower bound for the oracle k* without extra assumptions

Determine whether there exists a universal method to derive an explicit lower bound for the oracle index k*(δ, n), defined by Equation (kstar) as the largest k in the grid K such that B(k, n, δ) ≤ γ V(k, δ) under the bias–variance decomposition of Condition 1 (Bias-Variance Decomposition), without imposing assumptions beyond that condition (e.g., without second-order or von Mises assumptions); alternatively, prove that no such universal method exists.


Background

The paper introduces an oracle extreme sample size k*(δ, n), defined to balance the deviation term V(k, δ) and the bias term B(k, n, δ) in a non-asymptotic bias–variance decomposition (Condition 1). This oracle enables explicit error bounds but requires knowledge of k*(δ, n), which itself is unknown.
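The selection rule behind k*(δ, n) can be sketched as follows: k* is the largest k in the grid K for which the bias term B(k, n, δ) is dominated by γ times the deviation term V(k, δ). The concrete forms of B and V below are illustrative placeholders (not the paper's actual bounds), chosen only so that B grows and V shrinks in k:

```python
# Minimal sketch of the oracle index k*(delta, n), assuming hypothetical
# bias and deviation terms. B and V here are placeholders for illustration;
# the paper's Condition 1 only posits their existence, not these forms.
import math


def V(k, delta):
    # Placeholder deviation term: shrinks as k (number of extremes) grows.
    return math.sqrt(math.log(1.0 / delta) / k)


def B(k, n, delta):
    # Placeholder bias term: grows as k moves deeper into the bulk of the sample.
    return math.sqrt(k / n)


def oracle_k_star(n, delta, grid, gamma=1.0):
    """Largest k in `grid` with B(k, n, delta) <= gamma * V(k, delta), else None."""
    admissible = [k for k in grid if B(k, n, delta) <= gamma * V(k, delta)]
    return max(admissible) if admissible else None


print(oracle_k_star(n=10_000, delta=0.05, grid=range(2, 1001)))  # prints 173
```

With these placeholder terms the condition reduces to k² ≤ n·log(1/δ), so the oracle is computable in closed form; the open problem is precisely that, under Condition 1 alone, no such explicit handle on k*(δ, n) is available.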

Under only the minimal assumptions encapsulated by Condition 1, the authors highlight that making the oracle bound fully explicit hinges on obtaining a lower bound on k*(δ, n). While they provide such bounds under additional von Mises (second-order) conditions in Section 4, they conjecture that achieving a universal lower bound without further assumptions may be impossible.

References

However, since $k^*(\delta,n)$ itself is unknown, this bound is not fully explicit unless a lower bound on $k^*(\delta,n)$ can be derived. We conjecture that there is no universal method to achieve this without further assumptions.

Adaptive tail index estimation: minimal assumptions and non-asymptotic guarantees (2505.22371 - Lederer et al., 28 May 2025) in Section 2.1 (Adaptive Validation framework)