Avoid extreme conservativeness without sample splitting when $d = o(n^{2/3})$
Establish that, when the initial estimator $\hat{\theta}_1$ of the parameter and the curvature (Hessian) estimator $\hat{V}_G$ are computed from the same data rather than from independent splits, the studentized, bias-corrected universal inference procedure for the Kullback–Leibler projection parameter $\theta_G$ avoids extreme conservativeness as the sample size $n$ grows, provided the parameter dimension satisfies $d = o(n^{2/3})$; that is, its miscoverage probability at a fixed nominal level does not converge to zero. In particular, prove such a result under the regularity framework of Section 2.2, for instance by verifying the second-order bias requirement $n^{1/2}\,\|\hat{\theta}_1-\theta_G\|_{I_G}\,\|I_G^{-1/2}(\hat{V}_G - V_G)I_G^{-1/2}\|_{\mathrm{op}} = o_P(1)$, or an analogous condition sufficient to preclude extreme conservativeness.
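To see why the target rate $d = o(n^{2/3})$ is nontrivial, consider the following back-of-the-envelope calculation. It is a heuristic sketch that assumes the standard parametric rates $\|\hat{\theta}_1-\theta_G\|_{I_G} = O_P(\sqrt{d/n})$ and $\|I_G^{-1/2}(\hat{V}_G-V_G)I_G^{-1/2}\|_{\mathrm{op}} = O_P(\sqrt{d/n})$; these rates are not asserted by the framework of Section 2.2 and may fail there. Under them, a crude product bound on the second-order bias term gives
\[
n^{1/2}\,\|\hat{\theta}_1-\theta_G\|_{I_G}\,
\bigl\|I_G^{-1/2}(\hat{V}_G - V_G)I_G^{-1/2}\bigr\|_{\mathrm{op}}
= n^{1/2}\cdot O_P\!\Bigl(\sqrt{\tfrac{d}{n}}\Bigr)\cdot O_P\!\Bigl(\sqrt{\tfrac{d}{n}}\Bigr)
= O_P\!\Bigl(\tfrac{d}{n^{1/2}}\Bigr),
\]
which is $o_P(1)$ only when $d = o(n^{1/2})$. Reaching $d = o(n^{2/3})$ therefore presumably requires exploiting cancellations beyond this naive bound, in the spirit of the bias-correction analysis of \citet{chang2023inference}.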
References
Mirroring the results in \citet{chang2023inference}, we conjecture that extreme conservativeness can be avoided when $d = o(n^{2/3})$.
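As a purely illustrative sanity check (not the studentized, bias-corrected procedure above), the following minimal sketch simulates plain universal inference in a one-dimensional Gaussian location model with no sample splitting: the full-data MLE is reused in the likelihood-ratio denominator. The setup and all names are hypothetical and chosen only because the log-likelihood ratio is exactly pivotal here; the empirical miscoverage settles near $P(\chi^2_1 > 2\log(1/\alpha)) \approx 0.032$ for $\alpha = 0.1$, i.e., bounded away from zero, consistent with the absence of extreme conservativeness in this toy no-split case.

```python
import numpy as np

# Toy Monte Carlo: plain (non-studentized) universal inference WITHOUT
# sample splitting, in a 1-d Gaussian location model with unit variance.
# Illustrative only; not the procedure studied in the text.
rng = np.random.default_rng(0)
alpha, n, reps = 0.10, 200, 50_000
theta0 = 0.0  # true parameter (the KL projection equals theta0 here)

miscover = 0
for _ in range(reps):
    x = rng.normal(theta0, 1.0, n)
    theta_hat = x.mean()  # full-data MLE: no split is used
    # Universal set: {theta : log L(theta_hat) - log L(theta) <= log(1/alpha)};
    # in this model the log-likelihood ratio at theta0 is n*(theta_hat - theta0)^2 / 2.
    log_lr = n * (theta_hat - theta0) ** 2 / 2.0
    miscover += log_lr > np.log(1.0 / alpha)

print(f"empirical miscoverage: {miscover / reps:.4f}")
# ~0.032 = P(chi2_1 > 2*log(1/alpha)): a fixed, nonzero asymptotic miscoverage,
# i.e., no extreme conservativeness in this toy no-split example.
```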