A constrained risk inequality for general losses (1804.08116v3)
Published 22 Apr 2018 in math.ST and stat.TH
Abstract: We provide a general constrained risk inequality that applies to arbitrary non-decreasing losses, extending a result of Brown and Low [Ann. Stat. 1996]. Given two distributions $P_0$ and $P_1$, we give a lower bound on the risk of estimating a parameter $\theta(P_1)$ under $P_1$, given an upper bound on the risk of estimating the parameter $\theta(P_0)$ under $P_0$. The inequality is a useful pedagogical tool: its proof relies only on the Cauchy-Schwarz inequality, it applies to general losses, and it transparently gives risk lower bounds for super-efficient and adaptive estimators.
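
For context, here is a minimal sketch of the classical squared-error special case due to Brown and Low, which the paper's general-loss inequality extends. The notation $\Delta$, $I$, and $\epsilon$ below is supplied for illustration, under the assumption that the likelihood ratio $L = dP_1/dP_0$ exists and has a finite second moment under $P_0$. Let $\hat\theta$ be any estimator, $\Delta = |\theta(P_1) - \theta(P_0)|$, and $I^2 = E_{P_0}[L^2]$. If the risk under $P_0$ satisfies
$$E_{P_0}\big[(\hat\theta - \theta(P_0))^2\big] \le \epsilon^2,$$
then the Cauchy-Schwarz inequality gives
$$\big|E_{P_1}[\hat\theta - \theta(P_0)]\big| = \big|E_{P_0}\big[(\hat\theta - \theta(P_0))\,L\big]\big| \le \epsilon I,$$
so the bias of $\hat\theta$ under $P_1$ is at least $\Delta - \epsilon I$ in absolute value, and whenever $\epsilon I \le \Delta$,
$$E_{P_1}\big[(\hat\theta - \theta(P_1))^2\big] \ge (\Delta - \epsilon I)^2.$$
In words: an estimator that is very accurate under $P_0$ (small $\epsilon$) cannot simultaneously be accurate under a nearby $P_1$ with a different parameter value, which is exactly the mechanism behind the lower bounds for super-efficient and adaptive estimators mentioned above.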