
The RKHS Approach to Minimum Variance Estimation Revisited: Variance Bounds, Sufficient Statistics, and Exponential Families

Published 24 Oct 2012 in math.ST and stat.TH (arXiv:1210.6516v2)

Abstract: The mathematical theory of reproducing kernel Hilbert spaces (RKHS) provides powerful tools for minimum variance estimation (MVE) problems. Here, we extend the classical RKHS-based analysis of MVE in several directions. We develop a geometric formulation of five known lower bounds on the estimator variance (Barankin bound, Cramér-Rao bound, constrained Cramér-Rao bound, Bhattacharyya bound, and Hammersley-Chapman-Robbins bound) in terms of orthogonal projections onto a subspace of the RKHS associated with a given MVE problem. We show that, under mild conditions, the Barankin bound (the tightest possible lower bound on the estimator variance) is a lower semicontinuous function of the parameter vector. We also show that the RKHS associated with an MVE problem remains unchanged if the observation is replaced by a sufficient statistic. Finally, for MVE problems conforming to an exponential family of distributions, we derive a novel closed-form lower bound on the estimator variance and show that a reduction of the parameter set leaves the minimum achievable variance unchanged.
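
The following is a minimal sketch of the geometric picture behind the bounds named in the abstract, assuming the standard RKHS formulation of MVE in which the kernel is built from likelihood ratios at a reference parameter theta_0. The symbols rho, R, gamma, and the specific subspace choices are illustrative conventions from that standard treatment, not necessarily the paper's exact notation.

```latex
% Illustrative sketch of the RKHS formulation of minimum variance
% estimation; notation (rho, R, gamma, theta_0) follows common RKHS
% treatments of MVE and is an assumption, not the paper's own text.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Fix a reference parameter $\theta_0$ and define the likelihood ratio
$\rho_{\theta_0}(x;\theta) = f(x;\theta)/f(x;\theta_0)$. The kernel
\[
  R(\theta_1,\theta_2)
  = \mathbb{E}_{\theta_0}\!\left[
      \rho_{\theta_0}(x;\theta_1)\,\rho_{\theta_0}(x;\theta_2)
    \right]
\]
induces an RKHS $\mathcal{H}(R)$ of functions on the parameter set.
For estimators with a prescribed mean function $\gamma(\theta)$, the
minimum achievable variance at $\theta_0$ (the Barankin bound) is
\[
  M(\theta_0)
  = \lVert \gamma \rVert_{\mathcal{H}(R)}^2 - \gamma^2(\theta_0),
\]
and projecting $\gamma$ onto any subspace
$\mathcal{U} \subseteq \mathcal{H}(R)$ yields a valid lower bound:
\[
  \lVert P_{\mathcal{U}}\,\gamma \rVert_{\mathcal{H}(R)}^2
  - \gamma^2(\theta_0)
  \;\le\; M(\theta_0).
\]
Choosing
$\mathcal{U} = \operatorname{span}\{R(\cdot,\theta_0),\,
\partial R(\cdot,\theta)/\partial\theta \,|_{\theta=\theta_0}\}$
recovers the Cram\'er--Rao bound, while
$\mathcal{U} = \operatorname{span}\{R(\cdot,\theta_0),\,
R(\cdot,\theta_0+\delta)\}$
gives the Hammersley--Chapman--Robbins bound.

\end{document}
```

Letting delta tend to 0 in the Hammersley-Chapman-Robbins subspace recovers the Cramér-Rao choice, which illustrates the general principle: larger projection subspaces give tighter bounds, with the full RKHS yielding the Barankin bound itself.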
