Wiener Filters in Gaussian Mixture Signal Estimation with Infinity-Norm Error (1405.4345v2)

Published 17 May 2014 in cs.IT and math.IT

Abstract: Consider the estimation of a signal ${\bf x}\in\mathbb{R}^N$ from noisy observations ${\bf r=x+z}$, where the input ${\bf x}$ is generated by an independent and identically distributed (i.i.d.) Gaussian mixture source, and ${\bf z}$ is additive white Gaussian noise (AWGN) in parallel Gaussian channels. Typically, the $\ell_2$-norm error (squared error) is used to quantify the performance of the estimation process. In contrast, we consider the $\ell_\infty$-norm error (worst-case error). For this error metric, we prove that, in an asymptotic setting where the signal dimension $N\to\infty$, the $\ell_\infty$-norm error always comes from the Gaussian component that has the largest variance, and the Wiener filter asymptotically achieves the optimal expected $\ell_\infty$-norm error. The i.i.d. Gaussian mixture case is easily applicable to i.i.d. Bernoulli-Gaussian distributions, which are often used to model sparse signals. Finally, our results can be extended to linear mixing systems with i.i.d. Gaussian mixture inputs, in settings where a linear mixing system can be decoupled to parallel Gaussian channels.
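
As a rough numerical illustration of the setting described in the abstract, the sketch below simulates parallel Gaussian channels with a two-component i.i.d. Gaussian mixture input and applies a Wiener (linear MMSE) gain matched to the larger-variance component, then reports the resulting $\ell_\infty$ error. All parameters (mixture weight, variances, dimension) are illustrative assumptions rather than values from the paper, and this simplified estimator is only a stand-in for the one analyzed there.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed for this sketch, not taken from the paper):
# a two-component Gaussian mixture prior for x and an AWGN variance for z.
N = 100_000            # signal dimension
p = 0.1                # probability of drawing from the large-variance component
sigma_large2 = 4.0     # variance of the large-variance component
sigma_small2 = 0.25    # variance of the small-variance component
sigma_z2 = 1.0         # AWGN variance

# Draw x from the i.i.d. Gaussian mixture and observe r = x + z.
component = rng.random(N) < p
x = np.where(component,
             rng.normal(0.0, np.sqrt(sigma_large2), N),
             rng.normal(0.0, np.sqrt(sigma_small2), N))
z = rng.normal(0.0, np.sqrt(sigma_z2), N)
r = x + z

# Wiener (linear MMSE) gain matched to the largest-variance component:
# x_hat = sigma_large2 / (sigma_large2 + sigma_z2) * r.
x_hat_wiener = sigma_large2 / (sigma_large2 + sigma_z2) * r

# Compare the worst-case (l_inf) error of the Wiener estimate with that of
# simply using the raw observations as the estimate.
print("l_inf error, Wiener filter  :", np.max(np.abs(x_hat_wiener - x)))
print("l_inf error, raw observation:", np.max(np.abs(r - x)))
```

In this toy run the worst-case error is driven by the entries drawn from the large-variance component, which is consistent with the abstract's claim that the $\ell_\infty$-norm error asymptotically comes from the Gaussian component with the largest variance.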

Citations (9)
