Tight MMSE Bounds for the AGN Channel Under KL Divergence Constraints on the Input Distribution (1804.10151v1)
Abstract: Tight bounds on the minimum mean square error (MMSE) for the additive Gaussian noise channel are derived when the input distribution is constrained to be epsilon-close to a Gaussian reference distribution in terms of the Kullback--Leibler divergence. The distributions that attain the bounds are shown to be Gaussian, with means identical to that of the reference distribution and covariance matrices defined implicitly via systems of matrix equations. The estimator that attains the upper bound is identified as a minimax optimal estimator that is robust against deviations from the assumed prior. The lower bound is shown to provide a potentially tighter alternative to the Cramér--Rao bound. Both properties are illustrated with numerical examples.
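To fix ideas, the display below is a minimal sketch of the problem the abstract describes; the symbols Y, X, N, p_0, P_eps, and Sigma_N are assumed notation chosen for illustration, not taken from the paper itself.

```latex
% Sketch of the setup suggested by the abstract (notation assumed, not the paper's own).
% Channel: additive Gaussian noise, with noise N independent of the input X.
\[
  Y = X + N, \qquad N \sim \mathcal{N}(0, \Sigma_N), \qquad N \perp X .
\]
% Feasible inputs: distributions epsilon-close in KL divergence to a
% Gaussian reference distribution p_0.
\[
  \mathcal{P}_\varepsilon
    = \bigl\{ p_X : D(p_X \,\|\, p_0) \le \varepsilon \bigr\},
  \qquad p_0 = \mathcal{N}(\mu_0, \Sigma_0).
\]
% The paper bounds the MMSE over this constraint set from below and above:
\[
  \min_{p_X \in \mathcal{P}_\varepsilon} \operatorname{mmse}(X \mid Y)
  \;\le\; \operatorname{mmse}(X \mid Y)
  \;\le\; \max_{p_X \in \mathcal{P}_\varepsilon} \operatorname{mmse}(X \mid Y),
\]
% where the MMSE is the error of the conditional-mean estimator:
\[
  \operatorname{mmse}(X \mid Y)
    = \mathbb{E}\bigl[\lVert X - \mathbb{E}[X \mid Y] \rVert^2\bigr].
\]
```

Per the abstract, both extrema are attained by Gaussian inputs sharing the reference mean, and the estimator attaining the upper bound is minimax robust against deviations from the assumed prior.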