Efron's Theorem: Monotonicity in Log-Concave Models

Updated 7 January 2026
  • Efron's Theorem is a fundamental result establishing monotonicity and self-consistency for conditional expectations of log-concave random variables.
  • Its proofs leverage kernel representations of covariance and the Brascamp–Lieb inequality to show that conditioning on sums of independent log-concave variables preserves monotonicity.
  • The theorem underpins practical applications in convex geometry, MCMC algorithms, and nonparametric survival analysis by ensuring stability in statistical estimators.

Efron's Theorem refers to a suite of fundamental monotonicity and self-consistency results established by Bradley Efron, with enduring influence in probability, statistics, and measure theory. The most celebrated instance is Efron's monotonicity theorem for conditional expectations with respect to sums of independent log-concave random variables. Additional foundational contributions include the self-consistency equation in censored survival analysis and mean volume formulas for random simplices. The theorem and its extensions are deeply linked to the stability of log-concavity under convolution, measure-theoretic inequalities, and the structure of nonparametric statistical estimators.

1. Efron's Monotonicity Theorem for Log-Concave Variables

Let $X, Y$ be independent real-valued random variables with log-concave densities $g_X(x) = e^{-\varphi_X(x)}$ and $g_Y(y) = e^{-\varphi_Y(y)}$. For any measurable function $\Psi: \mathbb{R}^2 \to \mathbb{R}$ that is nondecreasing in each coordinate, define

$$I(s) = \mathbb{E}\left[\Psi(X, Y) \mid X + Y = s\right].$$

Efron's theorem asserts that $s \mapsto I(s)$ is nondecreasing on $\mathbb{R}$ (Saumard et al., 2014, Saumard et al., 2017, Oudghiri, 2020). This monotonicity also extends to conditional survival functions and to expectations of nondecreasing functions of $X$ or $Y$ given the sum.

The result generalizes straightforwardly:

  • For any $m$-tuple of independent log-concave random variables $(X_1, \ldots, X_m)$ and $\Phi: \mathbb{R}^m \to \mathbb{R}$ coordinate-wise nondecreasing, the mapping $z \mapsto \mathbb{E}[\Phi(X_1, \ldots, X_m) \mid \sum_{i=1}^m X_i = z]$ is nondecreasing [(Saumard et al., 2014), §2].
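
The monotonicity is easy to probe by simulation. The sketch below (an illustration, not drawn from the cited papers) uses i.i.d. standard Gaussians, where $\mathbb{E}[X \mid X + Y = s] = s/2$ exactly, and estimates the conditional mean by binning on the sum:

```python
import numpy as np

# Monte Carlo check of Efron's monotonicity for X, Y iid N(0, 1) and
# Psi(x, y) = x (nondecreasing in each coordinate).
# Here E[X | X + Y = s] = s/2, so I(s) must be nondecreasing in s.
rng = np.random.default_rng(0)
n = 400_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
s = x + y

# Estimate I(s) on bins of the sum and check monotonicity across bins.
edges = np.linspace(-2.0, 2.0, 9)            # 8 bins of width 0.5
which = np.digitize(s, edges)
estimates = [x[which == k].mean() for k in range(1, len(edges))]

assert all(b > a for a, b in zip(estimates, estimates[1:]))
# Each estimate should sit near (bin center) / 2.
centers = (edges[:-1] + edges[1:]) / 2
assert np.allclose(estimates, centers / 2, atol=0.05)
```

With this sample size each bin holds tens of thousands of draws, so the Monte Carlo noise is far smaller than the spacing between consecutive values of $s/2$.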

2. Proof Methods and Kernel-Covariance Formulations

Efron's theorem admits several proof strategies. The classical approach employs kernel representations for covariance, notably the Hoeffding–Shorack identity

$$\mathrm{Cov}[a(X), b(Y)] = \iint \bigl(H(x, y) - F(x)\, G(y)\bigr)\, da(x)\, db(y),$$

where $F, G$ are the marginal cdfs and $H$ the joint cdf (Saumard et al., 2017). For absolutely continuous $a, b$, this yields double-integral kernel forms.
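
The identity can be verified numerically. The sketch below (an illustration under an assumed bivariate Gaussian model) takes $a(x) = x$, $b(y) = y$, approximates $H$, $F$, $G$ by empirical cdfs on a grid, and compares the double integral with the sample covariance:

```python
import numpy as np

# Numerical check of the Hoeffding covariance identity with a(x) = x,
# b(y) = y:  Cov(X, Y) = integral of (H(x, y) - F(x) G(y)) dx dy.
rng = np.random.default_rng(1)
n, rho = 200_000, 0.6
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
x = z1
y = rho * z1 + np.sqrt(1 - rho**2) * z2      # Corr(X, Y) = rho

# Empirical joint cdf H on a grid via a 2-D histogram and cumulative sums.
lo, hi, m = -5.0, 5.0, 400
counts, xe, ye = np.histogram2d(x, y, bins=m, range=[[lo, hi], [lo, hi]])
H = counts.cumsum(axis=0).cumsum(axis=1) / n
F = H[:, -1]                                  # marginal cdf of X
G = H[-1, :]                                  # marginal cdf of Y
dx = (hi - lo) / m

hoeffding = np.sum(H - np.outer(F, G)) * dx * dx
sample_cov = np.cov(x, y)[0, 1]
assert abs(sample_cov - rho) < 0.02
assert abs(hoeffding - sample_cov) < 0.05
```

The grid sum is only a Riemann approximation of the double integral, so the agreement is up to discretization and sampling error.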

A modern approach utilizes the asymmetric Brascamp–Lieb inequality (Saumard et al., 2014). For strictly log-concave densities and $\Phi$ coordinate-wise nondecreasing, writing $S = X + Y$, the derivative of the conditional mean satisfies

$$\frac{d}{dz}\,\mathbb{E}\left[\Phi(X, Y) \mid S = z\right] = \mathbb{E}\left[\partial_1 \Phi(X, Y) \mid S = z\right] - \mathrm{Cov}\left(\Phi(X, Y),\, \varphi_X'(X) \mid S = z\right),$$

with nonnegativity governed by the covariance inequality

$$\left|\mathrm{Cov}\left(\Phi, \varphi_X'(X) \mid S = z\right)\right| \leq \mathbb{E}\left[\partial_1 \Phi \mid S = z\right].$$

This is a direct application of one-dimensional Brascamp–Lieb–Otto–Menz covariance bounds.
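
A quick sanity check of this representation (a standard Gaussian computation, not taken from the cited papers): take $X, Y$ i.i.d. $N(0, 1)$ and $\Phi(x, y) = x$, so that $\varphi_X'(x) = x$ and

$$X \mid X + Y = z \;\sim\; N\!\left(\tfrac{z}{2}, \tfrac{1}{2}\right).$$

The conditional mean $\mathbb{E}[X \mid X + Y = z] = z/2$ has derivative

$$\tfrac{1}{2} \;=\; \mathbb{E}[\partial_1 \Phi \mid S = z] - \mathrm{Cov}(X, X \mid S = z) \;=\; 1 - \operatorname{Var}(X \mid S = z) \;=\; 1 - \tfrac{1}{2},$$

and the covariance bound holds with $\tfrac{1}{2} \leq 1$.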

In more general settings, (Saumard et al., 2017) develops kernel operators and domination conditions for dependent or non-log-concave $(X, Y)$, providing sufficient conditions for monotonicity via the sign of mixed partial derivatives of $\varphi(x, y)$.

3. Extensions, Generalizations, and PF Classes

(Oudghiri, 2020) formalizes the theorem's scope through three key generalizations:

  • Restricted Efron's Theorem: For $\psi: \mathbb{R} \to \mathbb{R}$ nondecreasing, the mappings $s \mapsto \mathbb{E}[\psi(X) \mid X + Y = s]$ and $s \mapsto \mathbb{E}[\psi(Y) \mid X + Y = s]$ are nondecreasing. This restricted property is, in fact, equivalent to the full (strong) version of Efron's theorem.
  • Pólya-frequency Class ($\mathrm{PF}_n$) Extension: For $X, Y$ independent with densities in $\mathrm{PF}_n$ ($\mathrm{PF}_2 =$ log-concave), and $n$ test functions $\{\varphi_k\}$ forming nonnegative determinants, the $n$-tuple of conditional expectations satisfies determinant monotonicity in $s$. This extends monotonicity and convolution stability to higher-order total positivity classes.
  • Exponentially Tilted Generalizations: For independent log-concave $X, Y$ and $\varphi(x, y)$ with exponential tilt $e^{-a(x+y)}$, the mapping $s \mapsto e^{-as}\, \mathbb{E}[\varphi(X, Y) \mid X + Y = s]$ remains nondecreasing under coordinatewise monotonicity. This unifies ordinary and tilted forms of Efron's theorem.
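
The restricted form can be checked deterministically by quadrature. In the toy example below (an illustration, not from the sources), $X, Y$ are i.i.d. $\mathrm{Exp}(1)$, so $g_X(x)\,g_Y(s - x) = e^{-s}$ is constant in $x$ on $(0, s)$; the conditional law of $X$ given $X + Y = s$ is uniform on $(0, s)$ and $\mathbb{E}[\psi(X) \mid X + Y = s] = s^2/3$ for $\psi(x) = x^2$:

```python
import numpy as np

# Restricted Efron's theorem, checked by quadrature for X, Y iid Exp(1)
# and psi(x) = x^2. The conditional density of X given X + Y = s is
# proportional to g_X(x) g_Y(s - x) on (0, s).
def cond_mean(s, num=20_001):
    x = np.linspace(0.0, s, num)
    w = np.exp(-x) * np.exp(-(s - x))     # unnormalized conditional density
    return np.sum(x**2 * w) / np.sum(w)   # weighted average of psi(x) = x^2

grid = np.linspace(0.5, 4.0, 8)
vals = [cond_mean(s) for s in grid]

assert all(b > a for a, b in zip(vals, vals[1:]))   # nondecreasing in s
assert np.allclose(vals, grid**2 / 3, rtol=1e-3)    # matches s^2 / 3
```

Because the integrand is evaluated on a uniform grid, the weighted sum is an exact ratio up to $O(1/\text{num})$ endpoint bias, which the tolerance absorbs.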

4. Preservation of Log-Concavity and Applications

Efron's theorem supplies the key step in demonstrating the closure of (strong) log-concavity under convolution. Given independent $X \sim p$ and $Y \sim q$ with $p, q$ log-concave, the convolution $p * q$ is also log-concave because the score $\varphi_{p*q}'(z) = \mathbb{E}[\varphi_p'(X) \mid X + Y = z]$ inherits monotonicity from the coordinatewise monotonicity property [(Saumard et al., 2014), §4]. For strong log-concavity, the relative score also propagates by an analogous conditional expectation.
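
The closure property itself is easy to probe numerically. The sketch below (an illustrative check, with an assumed Gaussian and Laplace pair) convolves two log-concave densities on a grid and verifies that the logarithm of the result is concave:

```python
import numpy as np

# Numerical check that the convolution of two log-concave densities is
# log-concave: second differences of log(p * q) should be <= 0.
dx = 0.01
z = np.arange(-10.0, 10.0 + dx / 2, dx)
p = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)   # standard Gaussian density
q = 0.5 * np.exp(-np.abs(z))                 # Laplace(0, 1) density

conv = np.convolve(p, q) * dx                # density of X + Y on a wider grid
assert abs(conv.sum() * dx - 1.0) < 1e-2     # still (approximately) a density

# Restrict to an interior window where the density is not vanishingly small.
mid = len(conv) // 2                         # index corresponding to z = 0
window = conv[mid - 500 : mid + 500]
second_diff = np.diff(np.log(window), n=2)
assert np.all(second_diff <= 1e-9)
```

Sampling a log-concave function on a uniform grid yields a log-concave sequence, and discrete convolution preserves that property, so the check passes up to floating-point noise.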

This monotonicity principle underpins further results:

  • Prékopa–Leindler and Brunn–Minkowski type theorems for log-concave measures.
  • Sub-Gaussian concentration, log–Sobolev inequalities, and isoperimetric results.
  • Geometric ergodicity and efficient mixing for MCMC algorithms targeting log-concave (and strongly log-concave) densities.
  • In Bayesian analysis, Laplace approximations, saddle-point expansions, and convex optimization for hyperparameter estimation in log-concave models exploit these properties.
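
As a minimal illustration of the sampling point (a textbook unadjusted Langevin sketch, not taken from the cited papers), for a strongly log-concave target $\pi \propto e^{-\varphi}$ the iteration $x_{k+1} = x_k - h\,\varphi'(x_k) + \sqrt{2h}\,\xi_k$ mixes rapidly:

```python
import numpy as np

# Unadjusted Langevin algorithm (ULA) for the strongly log-concave target
# N(2, 1), i.e. phi(x) = (x - 2)^2 / 2 and phi'(x) = x - 2.
rng = np.random.default_rng(2)
h, n_steps, burn = 0.05, 200_000, 1_000
noise = np.sqrt(2 * h) * rng.standard_normal(n_steps)

x = 0.0
samples = np.empty(n_steps)
for k in range(n_steps):
    x = x - h * (x - 2.0) + noise[k]         # gradient step + diffusion
    samples[k] = x

kept = samples[burn:]
assert abs(kept.mean() - 2.0) < 0.05         # near the target mean
assert abs(kept.var() - 1.0) < 0.1           # near the target variance
```

ULA carries an $O(h)$ stationary bias (here the stationary variance is $1/(1 - h/2)$), which the tolerances absorb; the point is the fast, stable mixing guaranteed for log-concave targets.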

5. Generalizations to Dependent Measures and Quantitative Bounds

(Saumard et al., 2017) extends Efron's monotonicity to joint densities $h(x, y) = \exp\{-\varphi(x, y)\}$, including dependent $(X, Y)$, via conditions on the mixed partials

$$\partial^2_{22}\varphi(x', s_0 - x') - \partial^2_{12}\varphi(x', s_0 - x') \ge 0, \qquad \partial^2_{11}\varphi(s_0 - y', y') - \partial^2_{21}\varphi(s_0 - y', y') \ge 0$$

on the support. Under suitable domination and smoothness conditions, one obtains not only monotonicity but explicit lower bounds on derivatives of the conditional mean, relevant for quantitative stability in measurement-error models and regression.
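
A useful sanity check (a direct computation, not from the cited paper) is the independent case: for

$$\varphi(x, y) = \varphi_X(x) + \varphi_Y(y) \quad\Longrightarrow\quad \partial^2_{12}\varphi = \partial^2_{21}\varphi = 0,$$

the two displayed conditions reduce to $\varphi_Y'' \ge 0$ and $\varphi_X'' \ge 0$, i.e. convexity of the potentials, which is exactly log-concavity of the marginal densities as in Efron's original setting.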

Such extensions accommodate non-log-concave models, including certain copula structures, expanding the applicability of Efron's monotonicity property in modern probability, statistics, and econometrics.

6. Efron’s Theorem in Censored Survival Analysis

Efron's self-consistency principle forms a theoretical basis for nonparametric estimation under right-censoring (Strawderman et al., 2023). Consider i.i.d. pairs $(X_i, D_i)$ with $X_i = \min\{T_i, U_i\}$ and $D_i = I\{T_i \leq U_i\}$ for failure times $T_i$ and censoring times $U_i$. The survival function estimator $\widehat{S}(t)$ satisfies the fixed-point equation

$$\widehat{S}(t) = \widehat{S}_0(t) + \frac{1}{n} \sum_{i=1}^n (1 - D_i)\, I\{X_i \leq t\}\, \frac{\widehat{S}(t)}{\widehat{S}(X_i)}.$$

By reformulation, this yields an anticipating Volterra integral equation for the product-limit estimator of the censoring distribution, whose solution is expressed as a product integral, the classic Kaplan–Meier estimator:

$$\widehat{S}(t) = \prod_{(0, t]} \left(1 - \frac{dN(u)}{Y(u)}\right),$$

where $N$ counts observed failures and $Y$ is the at-risk process. This self-consistency is equivalent to the "redistribution-to-the-right" algorithm and underpins the construction of inverse probability of censoring weighted (IPCW) estimators.
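
The fixed-point characterization is easy to verify on a toy dataset (an illustrative sketch; the data and the helper `km` are assumptions, not from the cited paper):

```python
import numpy as np

# Verify Efron's self-consistency equation for the Kaplan-Meier estimator
# on a small right-censored dataset (D = 1: failure observed, 0: censored).
X = np.array([1.0, 2.0, 3.0, 4.0])
D = np.array([1, 0, 1, 1])
n = len(X)

def km(t):
    """Kaplan-Meier product-limit estimate of S(t) (no tied event times)."""
    s = 1.0
    for xi, di in sorted(zip(X, D)):
        if xi <= t and di == 1:
            s *= 1.0 - 1.0 / np.sum(X >= xi)   # 1 - dN(xi) / Y(xi)
    return s

# Self-consistency: S(t) = S0(t) + (1/n) * sum over censored X_i <= t of
# S(t) / S(X_i), where S0(t) = (1/n) * #{X_i > t}.
for t in X:
    rhs = np.mean(X > t)
    for xi, di in zip(X, D):
        if di == 0 and xi <= t:
            rhs += km(t) / km(xi) / n
    assert abs(km(t) - rhs) < 1e-12
```

Each censored observation's mass is redistributed proportionally to $\widehat{S}(t)/\widehat{S}(X_i)$, which is exactly the "redistribution-to-the-right" reading of the equation.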

7. Impact and Significance

Efron's monotonicity theorem is a keystone in the modern theory of log-concave measures, total positivity, and statistical inference for censored and incomplete data. Its consequences extend to convex geometry, functional inequalities, sampling algorithms, and nonparametric survival analysis. The theorem and its numerous generalizations supply both core theoretical justification and practical methodology for stability results, estimator construction, and the analysis of stochastic models encompassing independence, dependence, and structured noise.

Key technical references include (Saumard et al., 2014, Saumard et al., 2017, Oudghiri, 2020), and (Strawderman et al., 2023); see also the foundational contributions by Efron (1965, 1967, 1969).
