
Bell Polynomial Recursions in Denoising

Updated 11 December 2025
  • Bell Polynomial Recursions are a combinatorial framework that recursively builds higher-order denoisers by encoding corrections via score functions and derivatives.
  • They systematically control approximation errors in optimal transport maps, achieving explicit error rates such as O(σ^(2(K+1))) in statistical signal recovery.
  • Their computation leverages partial Bell polynomials to inductively isolate map coefficients, enabling practical, data-driven nonparametric denoising.

Bell polynomial recursions arise as a combinatorial and analytic framework underlying the construction of hierarchical or higher-order denoisers in empirical Bayes and optimal transport approaches to statistical signal recovery. They encode the recursive structure of solutions to differential, moment-matching, and transport equations, enabling explicit series-expansion formulas for map corrections in terms of higher-order score functions. This hierarchy has become prominent in recent denoising theory, particularly for the sequence of optimal transport denoisers interpolating between the observed noisy law and the unobserved clean distribution. These recursions are now central in describing, analyzing, and estimating distributional denoising maps purely from noisy data.

1. Definition and Emergence of Bell Polynomial Recursions

Bell polynomials $B_{n,p}$ are a family of combinatorial polynomials that enumerate the number of ways a set can be partitioned into subsets of given sizes, or, algebraically, encode the expansion coefficients of derivatives of composite functions. In the denoising context, they first appear as the organizing principle in the series expansion of the optimal transport map (OT map) from the noisy distribution $Q$ to the clean distribution $P$, with the form

$$T_K(y) = y + \sum_{k=1}^{K} \frac{\eta^k}{k!}\, h_k(y),$$

where each correction term $h_k$ is a polynomial in higher-order scores evaluated at $y$ and defined recursively by Bell polynomial combinations of previous $h_j$ (Liang, 10 Dec 2025).

The core formal recursion is

$$\sum_{\ell=0}^{k-1} (-1)^\ell \frac{k!}{(k-\ell)!\,\ell!} \sum_{j=1}^{k-\ell} G^{(2\ell+j)}(y)\, B_{k-\ell,j}\big(h_1, \ldots, h_{k-\ell-j+1}\big)(y) + (-1)^k G^{(2k)}(y) = 0,$$

in which the only occurrence of $h_k$ is isolated by setting $\ell = 0$, $j = 1$, yielding an explicit recursive formula with all remaining terms expressed as Bell polynomials of lower-order corrections and derivatives of the observed density.
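
To make the combinatorial object concrete, the following minimal sketch (assuming a Python environment with SymPy; it is not taken from the cited papers) prints a few partial Bell polynomials using SymPy's built-in `bell(n, p, symbols)`:

```python
# Minimal illustration of partial Bell polynomials B_{n,p}.
# sympy.bell(n, p, symbols) returns the partial (incomplete) Bell polynomial
# in the arguments x_1, ..., x_{n-p+1}.
import sympy as sp

x = sp.symbols('x1:6')   # symbols x1, ..., x5

for p in (1, 2, 3):
    print(f"B_{{3,{p}}} =", sp.bell(3, p, x[:3 - p + 1]))
# Expected output:
#   B_{3,1} = x3
#   B_{3,2} = 3*x1*x2
#   B_{3,3} = x1**3
```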

2. Hierarchy of Denoisers and Role of Bell Recursions

In the scalar additive Gaussian noise model $Y = X + \sigma Z$, with $Q$ the observed law and $q$ its density, the Bell polynomial recursion constructs a denoiser hierarchy $T_0, T_1, \ldots, T_\infty$. The $k$-th correction term $h_k$ depends polynomially on the set of score functions up to order $2k-1$:

$$s_m(y) = \frac{q^{(m)}(y)}{q(y)}, \qquad 1 \leq m \leq 2k-1.$$

The leading term $T_1$ resembles the half-shrinkage “optimal transport” denoiser, while the higher-order terms systematically address moment and density mismatches, leaving a residual of order $O(\sigma^{2(K+1)})$ after truncation at level $K$ (Liang, 12 Nov 2025, Liang, 10 Dec 2025).
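
For concreteness, the sketch below (an illustrative example with an assumed two-point prior, not taken from the cited papers) computes the first few higher-order scores $s_m = q^{(m)}/q$ symbolically for a Gaussian-mixture noisy density:

```python
# Higher-order scores s_m(y) = q^{(m)}(y) / q(y) for Y = X + sigma*Z,
# with an illustrative two-point prior X in {-1, +1}, so q is a Gaussian mixture.
import sympy as sp

y = sp.Symbol('y', real=True)
sigma = sp.Symbol('sigma', positive=True)

phi = lambda u: sp.exp(-u**2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))
q = (phi(y - 1) + phi(y + 1)) / 2      # observed noisy density of Y

def score(m):
    """m-th order score s_m(y) = q^{(m)}(y) / q(y)."""
    return sp.simplify(sp.diff(q, y, m) / q)

print(score(1))   # first-order (Tweedie) score
print(score(3))   # third-order score; orders up to 2k-1 feed the k-th correction h_k
```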

Bell recursions thus provide the mechanism to express each subsequent correction as an explicit function of $q$ and its derivatives without reference to $P$, a critical property for agnostic, empirical-Bayes denoising.

3. Combinatorial and Analytic Structure

The partial Bell polynomials $B_{n,p}(x_1, \ldots, x_{n-p+1})$ organize the nonlinearity arising from repeated application of the chain and product rules in the derivatives of composite functions (through Faà di Bruno's formula). In the optimal transport denoiser expansion, this structure embodies the effects of the nonlinear push-forward and the change of density under transformation, with the expansion of the OT map formally written as

$$T_G(y) = y + \sum_{k=1}^{\infty} \frac{\eta^k}{k!}\, h_k(y),$$

and each $h_k$ recursively built using Bell polynomials in the previous $h_j$ (Liang, 10 Dec 2025). The recursion's solution ensures that, at each truncation order, the map $T_K$ matches the smooth moments and the density up to $O(\sigma^{2(K+1)})$, with the error terms made explicit by the structure of the recursions.
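
The bookkeeping role of Faà di Bruno's formula can be checked directly. The sketch below (a minimal verification with an arbitrary concrete choice of inner and outer functions, assuming SymPy) confirms that the $n$-th derivative of a composite function equals the Bell-polynomial sum:

```python
# Check of Faa di Bruno's formula for the concrete composite exp(sin(y)):
#   d^n/dy^n f(g(y)) = sum_{p=1}^{n} f^{(p)}(g(y)) * B_{n,p}(g', ..., g^{(n-p+1)}).
import sympy as sp

y, t = sp.symbols('y t')
n = 4
f = sp.exp(t)                 # outer function, as an expression in a dummy t
g = sp.sin(y)                 # inner function g(y)

lhs = sp.diff(f.subs(t, g), y, n)

gder = [sp.diff(g, y, j) for j in range(1, n + 1)]   # g', g'', ..., g^{(n)}
rhs = sum(
    sp.diff(f, t, p).subs(t, g) * sp.bell(n, p, gder[:n - p + 1])
    for p in range(1, n + 1)
)

print(sp.simplify(lhs - rhs))   # -> 0
```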

4. Construction, Solution, and Computational Aspects

The Bell polynomial recursion enables a systematic inductive computation of the map coefficients. For example, to compute $h_k$:

  1. The term with $\ell = 0$, $j = 1$ is separated to isolate $h_k$.
  2. All remaining terms are expressed as combinations of derivatives $G^{(m)}(y)$ and previously determined $h_j$, $j < k$, themselves Bell polynomials in the lower-order scores.
  3. The solution for $h_k(y)$ is

$$h_k(y) = -\frac{1}{G^{(1)}(y)} \left\{ \sum_{\ell=1}^{k-1} (-1)^\ell \frac{k!}{(k-\ell)!\,\ell!} \sum_{j=1}^{k-\ell} G^{(2\ell+j)}(y)\, B_{k-\ell,j}\big(h_1, \ldots, h_{k-\ell-j+1}\big) + (-1)^k G^{(2k)}(y) \right\}.$$

This produces explicit symbolic formulas or computational graphs for practical evaluation. All required derivatives can be estimated from data using score matching or kernel estimation (Liang, 10 Dec 2025).
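
The inductive computation can be mirrored symbolically. The sketch below is a minimal SymPy transcription that takes the displayed order-$k$ recursion at face value, treats $G$ as an abstract smooth function (the section does not spell out its definition), and is not the authors' implementation; it builds each order-$k$ identity and solves it for $h_k$, which appears linearly through $B_{k,1}(h_1, \ldots, h_k) = h_k$:

```python
# Symbolic sketch: solve the order-k Bell-polynomial recursion for h_k,
# taking the displayed recursion at face value with G an abstract function.
import sympy as sp

y = sp.Symbol('y')
G = sp.Function('G')
K = 3                 # compute corrections h_1, ..., h_K
h = {}                # h[k] will hold h_k(y) expressed via derivatives of G

for k in range(1, K + 1):
    hk = sp.Dummy(f'h_{k}')                      # unknown correction at this order
    expr = (-1) ** k * sp.diff(G(y), y, 2 * k)   # the (-1)^k G^{(2k)}(y) term
    for l in range(k):                           # l = 0, ..., k-1
        for j in range(1, k - l + 1):
            # Bell-polynomial arguments h_1, ..., h_{k-l-j+1};
            # only the (l, j) = (0, 1) term reaches the unknown h_k.
            xs = [h[i] if i < k else hk for i in range(1, k - l - j + 2)]
            expr += ((-1) ** l * sp.binomial(k, l)
                     * sp.diff(G(y), y, 2 * l + j)
                     * sp.bell(k - l, j, xs))
    h[k] = sp.solve(sp.Eq(expr, 0), hk)[0]       # the equation is linear in h_k

# h_1 = G''(y)/G'(y): the first-order score s_1 if G' is taken to be q (an assumption).
print(sp.simplify(h[1]))
```

At $k = 1$ the output reproduces the first-order score, consistent with $T_1(y) = y + \eta\, s_1(y)$ when $G'$ plays the role of the observed density $q$ (an assumption made here).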

5. Statistical Interpretation and Estimation

The utility of the Bell recursion is statistical: each higher-order correction corresponds to improved control on the Wasserstein-$r$ distance between the push-forward law $T_K \sharp Q$ and the unknown clean law $P$, yielding exponentially decaying error in the expansion order:

$$W_r(T_K \sharp Q, P) \lesssim \eta^{K+1} = O(\sigma^{2(K+1)}).$$

Empirical implementation only requires accurate nonparametric estimation of the score sequence $\{s_m\}$ up to order $2K-1$. Both kernel-based plug-in estimators and direct higher-order score-matching estimators (with optimal rates under smoothness constraints) can supply the necessary ingredients for high-order Bell-recursive denoising in practice (Liang, 10 Dec 2025).
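
As a purely illustrative example of the kernel-based plug-in route (not the estimator from the cited papers), the sketch below estimates the first-order score from noisy samples with a Gaussian kernel density estimate and an ad hoc reference bandwidth:

```python
# Kernel plug-in estimate of the first-order score s_1(y) = q'(y) / q(y)
# from noisy samples, using a Gaussian KDE and its analytic derivative.
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 5000, 0.5
x = rng.choice([-1.0, 1.0], size=n)            # latent clean signal (two-point prior)
y_obs = x + sigma * rng.standard_normal(n)     # observed noisy data

def kde_score(y, data, bw):
    """Plug-in estimate of s_1 at the points y, given noisy samples `data`."""
    d = (y[:, None] - data[None, :]) / bw
    w = np.exp(-0.5 * d**2)                    # unnormalized Gaussian kernel weights
    q_hat = w.sum(axis=1)                      # proportional to a KDE of q
    dq_hat = (-d / bw * w).sum(axis=1)         # proportional to a KDE of q'
    return dq_hat / q_hat                      # shared normalizations cancel

grid = np.linspace(-3.0, 3.0, 7)
bw = 1.06 * y_obs.std() * n ** (-1 / 5)        # Silverman-style reference bandwidth
print(np.round(kde_score(grid, y_obs, bw), 3))
```

Higher-order scores can be approached analogously via higher kernel derivatives, at the cost of harder nonparametric estimation.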

6. The Hierarchy of Distributional Shrinkage Denoisers

The Bell polynomial recursion fundamentally underpins the new hierarchy of distributional shrinkage denoisers, providing the algebraic foundation for the progression beyond classical Tweedie (empirical Bayes) denoisers or first-order optimal transport maps. The hierarchy can be summarized as follows:

| Level | Denoiser $T_K$ | Score Functions Needed | Asymptotic Error |
|-------|----------------|------------------------|------------------|
| $0$ | Identity | none | $O(1)$ |
| $1$ | $T_1(y) = y + \eta\, s_1(y)$ | first-order score $s_1$ | $O(\eta^2)$ |
| $K$ | Bell-recursive OT map | scores up to order $2K-1$ | $O(\eta^{K+1})$ |
| $\infty$ | Full OT map $T_\infty$ | all orders | exact transport |

This framework has broader implications in constructing functionally agnostic, theoretically optimal denoisers with sharp control over the target distribution at all levels of approximation (Liang, 12 Nov 2025, Liang, 10 Dec 2025).
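
A quick numeric sanity check of the $T_1$ row above (assuming the convention $\eta = \sigma^2/2$, which is consistent with the half-shrinkage reading of $T_1$ but is not stated explicitly in this summary) compares the first-order map with the exact Gaussian optimal transport map:

```python
# Sanity check: for X ~ N(0, tau^2) and Y = X + sigma*Z, the exact OT map is
# y -> y * tau / sqrt(tau^2 + sigma^2), and T_1(y) = y + eta * s_1(y) with
# s_1(y) = -y / (tau^2 + sigma^2) and eta = sigma^2 / 2 (assumed convention)
# agrees with it up to O(sigma^4).
import numpy as np

tau, y = 1.0, 0.7
for sigma in (0.3, 0.1, 0.03):
    exact = y * tau / np.sqrt(tau**2 + sigma**2)       # exact Gaussian OT map
    eta = sigma**2 / 2
    t1 = y + eta * (-y / (tau**2 + sigma**2))          # first-order denoiser T_1
    print(f"sigma={sigma:5.2f}  |T_1(y) - exact| = {abs(t1 - exact):.2e}")
```

The printed gaps shrink roughly like $\sigma^4$, matching the $O(\eta^2)$ entry for the level-$1$ row.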

7. Implications for Theory and Practice

Bell polynomial recursions represent more than a technical mechanism: they define the pathway from classical pointwise and plug-in denoising methods to fully distributional, nonparametric, and structure-exploiting approaches. Their inductive and hierarchical algebraic assembly indexes the entire optimal transport family of denoisers by powers of the noise level, systematically correcting for higher-order biases. This approach enables provable, explicit error rates in recovering the latent distribution and offers concrete, data-driven algorithms for constructing denoisers entirely from noisy samples, with statistical guarantees dictated by the recursion's order.
