
Polynomial Approximations of Conditional Expectations in Scalar Gaussian Channels (2102.05970v1)

Published 11 Feb 2021 in cs.IT, math.IT, and math.PR

Abstract: We consider a channel $Y=X+N$ where $X$ is a random variable satisfying $\mathbb{E}[|X|]<\infty$ and $N$ is an independent standard normal random variable. We show that the minimum mean-square error estimator of $X$ from $Y,$ which is given by the conditional expectation $\mathbb{E}[X \mid Y],$ is a polynomial in $Y$ if and only if it is linear or constant; these two cases correspond to $X$ being Gaussian or a constant, respectively. We also prove that the higher-order derivatives of $y \mapsto \mathbb{E}[X \mid Y=y]$ are expressible as multivariate polynomials in the functions $y \mapsto \mathbb{E}\left[ \left( X - \mathbb{E}[X \mid Y] \right)^k \mid Y = y \right]$ for $k\in \mathbb{N}.$ These expressions yield bounds on the $2$-norm of the derivatives of the conditional expectation. These bounds imply that, if $X$ has a compactly-supported density that is even and decreasing on the positive half-line, then the error in approximating the conditional expectation $\mathbb{E}[X \mid Y]$ by polynomials in $Y$ of degree at most $n$ decays faster than any polynomial in $n.$
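A quick numerical sketch (not from the paper) of the linear case mentioned in the abstract: when $X$ is Gaussian with variance $s^2$ and $N$ is standard normal, the MMSE estimator is the linear map $\mathbb{E}[X \mid Y] = \frac{s^2}{s^2+1}\,Y$. The Monte Carlo check below, with an arbitrarily chosen $s^2 = 4$, recovers that coefficient as the least-squares slope of $X$ on $Y$:

```python
import numpy as np

# Channel Y = X + N with X ~ N(0, s2) and N ~ N(0, 1) independent.
# For Gaussian X, E[X | Y] is linear: E[X | Y] = s2 / (s2 + 1) * Y.
rng = np.random.default_rng(0)
s2 = 4.0  # Var(X); an arbitrary choice for this demo
x = rng.normal(0.0, np.sqrt(s2), size=200_000)
n = rng.normal(0.0, 1.0, size=200_000)
y = x + n

# The least-squares slope of x on y estimates the coefficient of the
# linear conditional expectation.
slope = np.dot(x, y) / np.dot(y, y)
theory = s2 / (s2 + 1.0)
print(slope, theory)  # the two values should agree closely
```

For non-Gaussian (non-constant) $X$, the paper shows no polynomial in $Y$ of any degree equals $\mathbb{E}[X \mid Y]$ exactly, though under the stated density conditions polynomial approximations converge super-polynomially fast.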

Citations (5)
