Nuisance Function Tuning and Sample Splitting for Optimal Doubly Robust Estimation (2212.14857v3)

Published 30 Dec 2022 in math.ST, stat.ME, stat.ML, and stat.TH

Abstract: Estimators of doubly robust functionals typically rely on estimating two complex nuisance functions, such as the propensity score and the conditional outcome mean for the average treatment effect functional. We consider the problem of how to estimate nuisance functions so as to obtain optimal rates of convergence for a doubly robust nonparametric functional that has seen applications across the causal inference and conditional independence testing literatures. For several plug-in estimators and a first-order bias-corrected estimator, we illustrate how different tuning parameter choices for the nuisance function estimators interact with sample splitting strategies to determine the optimal rate of estimating the functional of interest. For each of these estimators and each sample splitting strategy, we show that under low regularity conditions it is necessary to either undersmooth or oversmooth the nuisance function estimators to obtain optimal rates of convergence for the functional of interest. Unlike the existing literature, we show that plug-in and first-order bias-corrected estimators can achieve minimax rates of convergence across all Hölder smoothness classes of the nuisance functions by carefully combining sample splitting and nuisance function tuning strategies.
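To make the abstract's setup concrete, the sketch below shows a first-order bias-corrected (AIPW-style) estimator of the average treatment effect with two-fold sample splitting (cross-fitting). This is a minimal illustration, not the paper's method: the nuisance estimators are deliberately crude stand-ins (least-squares outcome regressions and a fold-mean propensity score), and all function and variable names are our own.

```python
import numpy as np


def aipw_crossfit(X, A, Y, n_folds=2, seed=0):
    """Doubly robust (AIPW) ATE estimate with K-fold cross-fitting.

    Nuisance functions are simple placeholders: linear least-squares
    fits for the conditional outcome means mu1, mu0 and the fold mean
    of A for the propensity score pi. The paper studies how the tuning
    of such nuisance estimators interacts with the splitting scheme.
    """
    n = len(Y)
    fold = np.random.default_rng(seed).permutation(n) % n_folds
    psi = np.empty(n)  # estimated influence-function values
    for k in range(n_folds):
        tr, te = fold != k, fold == k  # fit nuisances on tr, evaluate on te
        Xtr = np.column_stack([np.ones(tr.sum()), X[tr]])
        Xte = np.column_stack([np.ones(te.sum()), X[te]])
        Atr, Ytr = A[tr], Y[tr]
        # outcome regressions fit separately on treated / control units
        b1, *_ = np.linalg.lstsq(Xtr[Atr == 1], Ytr[Atr == 1], rcond=None)
        b0, *_ = np.linalg.lstsq(Xtr[Atr == 0], Ytr[Atr == 0], rcond=None)
        mu1, mu0 = Xte @ b1, Xte @ b0
        # crude constant propensity score, clipped away from 0 and 1
        pi = np.clip(Atr.mean(), 0.05, 0.95)
        Ate, Yte = A[te], Y[te]
        # plug-in term plus first-order (inverse-weighted residual) correction
        psi[te] = (mu1 - mu0
                   + Ate * (Yte - mu1) / pi
                   - (1 - Ate) * (Yte - mu0) / (1 - pi))
    return psi.mean()


# Simulated example with known truth: Y = 1 + 2*A + X + noise, so ATE = 2.
rng = np.random.default_rng(1)
n = 4000
X = rng.uniform(size=n)
A = rng.integers(0, 2, size=n).astype(float)
Y = 1.0 + 2.0 * A + X + 0.5 * rng.standard_normal(n)
ate_hat = aipw_crossfit(X, A, Y)  # should be close to the true ATE of 2
```

Swapping the least-squares fits for smoothers with a bandwidth (and under- or oversmoothing that bandwidth) is exactly the kind of tuning-versus-splitting trade-off the abstract refers to.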

Citations (1)
