Misspecified Bayesian Cramér–Rao Bound (MBCRB)
- MBCRB is a generalized Bayesian bound that quantifies estimator performance in the presence of model mismatch using pseudotrue parameter mapping.
- It leverages Bayesian Fisher information and covariance–score identities to derive rigorous lower bounds on estimator covariance, with closed-form expressions for linear–Gaussian models.
- MBCRB provides practical insights for model validation, selection, and estimator design by directly incorporating prior information and bias effects due to misspecification.
The Misspecified Bayesian Cramér–Rao Bound (MBCRB) generalizes the classical Bayesian Cramér–Rao bound (BCRB) to situations where the working statistical model used by an estimator does not coincide with the true data-generating mechanism. The MBCRB quantifies achievable estimation performance under parametric model mismatch by introducing a pseudotrue parameter mapping and modified information measures, yielding rigorous lower bounds on estimator covariance that directly account for both the assumed model and the prior information. The formalism applies in broad settings and admits closed-form expressions for linear–Gaussian problems, supporting both model validation and model selection (Tang et al., 2023).
1. True Versus Assumed Models and Model Mismatch
Fundamental to the construction of the MBCRB is the recognition of two distinct statistical models: the "true" data-generating model and the "assumed" estimator model. The true model is specified by the joint pdf $p(\mathbf{x}, \boldsymbol{\theta}) = p(\mathbf{x} \mid \boldsymbol{\theta})\, p(\boldsymbol{\theta})$, with latent parameter $\boldsymbol{\theta} \in \Theta$ and data $\mathbf{x}$. In contrast, the assumed model adopts a potentially misspecified joint pdf $\tilde{p}(\mathbf{x}, \boldsymbol{\eta}) = \tilde{p}(\mathbf{x} \mid \boldsymbol{\eta})\, \tilde{p}(\boldsymbol{\eta})$ with parameter $\boldsymbol{\eta} \in \mathcal{H}$, where $\Theta$ and $\mathcal{H}$ need not coincide, nor must the assumed prior $\tilde{p}(\boldsymbol{\eta})$ agree with the true prior $p(\boldsymbol{\theta})$. Model mismatch arises whenever $\tilde{p} \neq p$, a practically ubiquitous scenario in statistical inference where simplified or incorrect parametric choices are made.
Key to the MBCRB is the "pseudotrue parameter" mapping $\boldsymbol{\eta}_0(\boldsymbol{\theta})$, defined as the minimizer, in Kullback–Leibler (KL) divergence, of the assumed likelihood from the true likelihood, for each $\boldsymbol{\theta}$ in the support of the true prior:
$$\boldsymbol{\eta}_0(\boldsymbol{\theta}) \;=\; \arg\min_{\boldsymbol{\eta} \in \mathcal{H}} \; D_{\mathrm{KL}}\!\left( p(\mathbf{x} \mid \boldsymbol{\theta}) \,\middle\|\, \tilde{p}(\mathbf{x} \mid \boldsymbol{\eta}) \right).$$
This mapping identifies, for each true value $\boldsymbol{\theta}$, the $\boldsymbol{\eta}$ whose assumed likelihood $\tilde{p}(\mathbf{x} \mid \boldsymbol{\eta})$ best approximates $p(\mathbf{x} \mid \boldsymbol{\theta})$ in the KL sense.
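As a minimal numerical sketch of the pseudotrue mapping (using hypothetical scalar Gaussian models, not values from the source), the KL minimizer can be computed directly and compared against its analytic value:

```python
import numpy as np

# KL divergence between scalar Gaussians N(m1, v1) and N(m2, v2)
def kl_gauss(m1, v1, m2, v2):
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

# True likelihood: x | theta ~ N(theta, 1); assumed: x | eta ~ N(a * eta, v_t)
# (a and v_t are hypothetical mismatch values, chosen for illustration)
a, v_t = 0.8, 2.0

def pseudotrue(theta, grid=np.linspace(-10, 10, 200001)):
    # eta0(theta) = argmin_eta KL( p(x | theta) || p~(x | eta) ), by grid search
    kl = kl_gauss(theta, 1.0, a * grid, v_t)
    return grid[np.argmin(kl)]

theta = 1.5
eta0 = pseudotrue(theta)
print(eta0, theta / a)  # numerical minimizer matches the analytic value theta / a
```

Here only the mean-mismatch term of the KL depends on eta, so the minimizer is theta / a regardless of the assumed variance; the grid search recovers this without using the closed form.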
2. Derivation of the MBCRB and Core Assumptions
The MBCRB lower bounds the covariance of any misspecified-unbiased (MS-unbiased) estimator $\hat{\boldsymbol{\eta}}(\mathbf{x})$, satisfying
$$\mathbb{E}_{p(\mathbf{x} \mid \boldsymbol{\theta})}\!\left[ \hat{\boldsymbol{\eta}}(\mathbf{x}) \right] = \boldsymbol{\eta}_0(\boldsymbol{\theta}) \quad \text{for all } \boldsymbol{\theta} \in \Theta.$$
Under likelihood regularity and suitable prior boundary conditions (e.g., the prior vanishes at the boundary of $\Theta$, ensuring integration by parts is well-defined), the main tool is the Bayesian Fisher information matrix (BFIM) of the true model:
$$\mathbf{J} = \mathbf{J}_{\mathrm{d}} + \mathbf{J}_{\mathrm{p}},$$
with $\mathbf{J}_{\mathrm{d}} = \mathbb{E}\!\left[ \nabla_{\boldsymbol{\theta}} \log p(\mathbf{x} \mid \boldsymbol{\theta}) \, \nabla_{\boldsymbol{\theta}}^{\mathsf{T}} \log p(\mathbf{x} \mid \boldsymbol{\theta}) \right]$ from the likelihood and $\mathbf{J}_{\mathrm{p}} = \mathbb{E}\!\left[ \nabla_{\boldsymbol{\theta}} \log p(\boldsymbol{\theta}) \, \nabla_{\boldsymbol{\theta}}^{\mathsf{T}} \log p(\boldsymbol{\theta}) \right]$ from the prior, all expectations taken under the true joint pdf $p(\mathbf{x}, \boldsymbol{\theta})$.
The derivation leverages covariance–score identities and the Cauchy–Schwarz inequality to establish, for any vectors $\mathbf{a}$ and $\mathbf{b}$,
$$\left( \mathbf{a}^{\mathsf{T}}\, \mathbb{E}\!\left[ (\hat{\boldsymbol{\eta}} - \boldsymbol{\eta}_0)\, \mathbf{s}^{\mathsf{T}} \right] \mathbf{b} \right)^{2} \;\le\; \left( \mathbf{a}^{\mathsf{T}} \mathbf{C}\, \mathbf{a} \right) \left( \mathbf{b}^{\mathsf{T}} \mathbf{J}\, \mathbf{b} \right),$$
where $\mathbf{s} = \nabla_{\boldsymbol{\theta}} \log p(\mathbf{x}, \boldsymbol{\theta})$ is the score of the true joint model and $\mathbf{C} = \mathbb{E}\!\left[ (\hat{\boldsymbol{\eta}} - \boldsymbol{\eta}_0)(\hat{\boldsymbol{\eta}} - \boldsymbol{\eta}_0)^{\mathsf{T}} \right]$ is the error covariance about the pseudotrue parameter.
Optimal choice of $\mathbf{b}$ yields the matrix MBCRB inequality:
$$\mathbf{C} \;\succeq\; \mathbf{A}\, \mathbf{J}^{-1} \mathbf{A}^{\mathsf{T}} \;\triangleq\; \mathrm{MBCRB},$$
where $\mathbf{A} = \mathbb{E}_{p(\boldsymbol{\theta})}\!\left[ \partial \boldsymbol{\eta}_0(\boldsymbol{\theta}) / \partial \boldsymbol{\theta}^{\mathsf{T}} \right]$ is the Jacobian-averaged matrix, arising from the covariance–score identity $\mathbb{E}\!\left[ (\hat{\boldsymbol{\eta}} - \boldsymbol{\eta}_0)\, \mathbf{s}^{\mathsf{T}} \right] = \mathbf{A}$, which holds under the stated boundary conditions. In expanded form,
$$\mathbf{C} \;\succeq\; \mathbb{E}\!\left[ \frac{\partial \boldsymbol{\eta}_0}{\partial \boldsymbol{\theta}^{\mathsf{T}}} \right] \left( \mathbf{J}_{\mathrm{d}} + \mathbf{J}_{\mathrm{p}} \right)^{-1} \mathbb{E}\!\left[ \frac{\partial \boldsymbol{\eta}_0}{\partial \boldsymbol{\theta}^{\mathsf{T}}} \right]^{\mathsf{T}}.$$
When the model is correctly specified ($\tilde{p} = p$, so that $\boldsymbol{\eta}_0(\boldsymbol{\theta}) = \boldsymbol{\theta}$ and $\mathbf{A} = \mathbf{I}$), the MBCRB reduces to the standard BCRB: $\mathbf{C} \succeq \mathbf{J}^{-1}$.
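The covariance–score identity and the resulting bound can be checked by Monte Carlo in a scalar toy problem (all values hypothetical; the mismatch is confined to the assumed measurement gain):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar toy values: true prior/noise variances, assumed gain h_t
sig_th2, sig_n2, h_t = 2.0, 0.5, 0.8

# True model: theta ~ N(0, sig_th2), x = theta + n with n ~ N(0, sig_n2)
M = 200_000
theta = rng.normal(0.0, np.sqrt(sig_th2), M)
x = theta + rng.normal(0.0, np.sqrt(sig_n2), M)

# Assumed model x = h_t * eta + noise gives pseudotrue eta0(theta) = theta / h_t
eta0 = theta / h_t
eta_hat = x / h_t                     # MS-unbiased: E[eta_hat | theta] = eta0(theta)

# Covariance-score identity: E[(eta_hat - eta0) * s] should approach A = 1 / h_t,
# where s is the score of the true joint pdf
s = (x - theta) / sig_n2 - theta / sig_th2
A_mc = np.mean((eta_hat - eta0) * s)
C_mc = np.mean((eta_hat - eta0) ** 2)

J = 1.0 / sig_n2 + 1.0 / sig_th2      # BFIM of the true model (scalar)
mbcrb = (1.0 / h_t) ** 2 / J          # MBCRB = A * J^{-1} * A
print(A_mc)                           # close to 1 / h_t = 1.25
print(C_mc >= mbcrb)                  # the lower bound holds for this estimator
```

The simple estimator eta_hat is MS-unbiased but not efficient, so its error variance sits strictly above the MBCRB here; per the text, the bound is approached asymptotically by the mismatched MAP estimator.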
3. Closed-Form MBCRB in Linear–Gaussian Models
In the case of linear–Gaussian systems, closed-form MBCRB expressions can be derived, illustrating its concrete implementation. The true model specifies the prior $\boldsymbol{\theta} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma}_{\boldsymbol{\theta}})$ and observations $\mathbf{x} = \mathbf{H} \boldsymbol{\theta} + \mathbf{n}$ with $\mathbf{n} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_{\mathbf{n}})$. The assumed model posits $\mathbf{x} = \tilde{\mathbf{H}} \boldsymbol{\eta} + \tilde{\mathbf{n}}$, prior $\boldsymbol{\eta} \sim \mathcal{N}(\tilde{\boldsymbol{\mu}}, \tilde{\boldsymbol{\Sigma}}_{\boldsymbol{\eta}})$, and $\tilde{\mathbf{n}} \sim \mathcal{N}(\mathbf{0}, \tilde{\boldsymbol{\Sigma}}_{\mathbf{n}})$.
For this setting:
- The pseudotrue parameter is given by
  $$\boldsymbol{\eta}_0(\boldsymbol{\theta}) = \left( \tilde{\mathbf{H}}^{\mathsf{T}} \tilde{\boldsymbol{\Sigma}}_{\mathbf{n}}^{-1} \tilde{\mathbf{H}} \right)^{-1} \tilde{\mathbf{H}}^{\mathsf{T}} \tilde{\boldsymbol{\Sigma}}_{\mathbf{n}}^{-1} \mathbf{H}\, \boldsymbol{\theta}.$$
- The constant Jacobian is
  $$\mathbf{A} = \left( \tilde{\mathbf{H}}^{\mathsf{T}} \tilde{\boldsymbol{\Sigma}}_{\mathbf{n}}^{-1} \tilde{\mathbf{H}} \right)^{-1} \tilde{\mathbf{H}}^{\mathsf{T}} \tilde{\boldsymbol{\Sigma}}_{\mathbf{n}}^{-1} \mathbf{H}.$$
- The BFIM is
  $$\mathbf{J} = \mathbf{H}^{\mathsf{T}} \boldsymbol{\Sigma}_{\mathbf{n}}^{-1} \mathbf{H} + \boldsymbol{\Sigma}_{\boldsymbol{\theta}}^{-1}.$$
- The MBCRB adopts the sandwich form
  $$\mathrm{MBCRB} = \mathbf{A} \left( \mathbf{H}^{\mathsf{T}} \boldsymbol{\Sigma}_{\mathbf{n}}^{-1} \mathbf{H} + \boldsymbol{\Sigma}_{\boldsymbol{\theta}}^{-1} \right)^{-1} \mathbf{A}^{\mathsf{T}}.$$
- For bounding the MSE relative to the true parameter, a bias term is added:
  $$\mathbb{E}\!\left[ (\hat{\boldsymbol{\eta}} - \boldsymbol{\theta})(\hat{\boldsymbol{\eta}} - \boldsymbol{\theta})^{\mathsf{T}} \right] \;\succeq\; \mathrm{MBCRB} + \mathbb{E}_{p(\boldsymbol{\theta})}\!\left[ \mathbf{r}(\boldsymbol{\theta})\, \mathbf{r}(\boldsymbol{\theta})^{\mathsf{T}} \right],$$
  with $\mathbf{r}(\boldsymbol{\theta}) = \boldsymbol{\eta}_0(\boldsymbol{\theta}) - \boldsymbol{\theta}$.
The maximum a posteriori estimator under the assumed model is MS-unbiased and asymptotically attains the MBCRB.
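These closed-form pieces are straightforward to assemble numerically. The sketch below (hypothetical matrices and covariances, zero-mean priors for simplicity) builds the bound and verifies the matrix inequality for the weighted least-squares estimator under the assumed likelihood, which is MS-unbiased for the pseudotrue parameter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy problem: n = 2 parameters, N = 6 measurements
n, N = 2, 6
H = rng.normal(size=(N, n))                   # true measurement matrix
H_t = H + 0.3 * rng.normal(size=(N, n))       # assumed (misspecified) matrix
S_n = 0.5 * np.eye(N)                          # true noise covariance
S_nt = 0.7 * np.eye(N)                         # assumed noise covariance
S_th = 2.0 * np.eye(n)                         # true prior covariance (zero mean)

# Closed-form pieces of the linear-Gaussian MBCRB
W = np.linalg.solve(H_t.T @ np.linalg.inv(S_nt) @ H_t,
                    H_t.T @ np.linalg.inv(S_nt))
A = W @ H                                      # constant Jacobian: eta0(theta) = A theta
J = H.T @ np.linalg.inv(S_n) @ H + np.linalg.inv(S_th)   # BFIM of the true model
mbcrb = A @ np.linalg.inv(J) @ A.T             # sandwich form
bias = (A - np.eye(n)) @ S_th @ (A - np.eye(n)).T  # E[r r^T] for a zero-mean prior

# eta_hat = W x is MS-unbiased; its error about eta0 is W n, so C = W S_n W^T
C = W @ S_n @ W.T
# Smallest eigenvalue of C - MBCRB: nonnegative up to roundoff (C - MBCRB is PSD)
print(np.linalg.eigvalsh(C - mbcrb).min())
```

Since E[W x | theta] = W H theta = A theta, the estimator W x reproduces the pseudotrue parameter on average, and its covariance dominates the computed MBCRB as the theory requires; `mbcrb + bias` gives the corresponding MSE bound relative to the true parameter.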
4. Asymptotic Tightness, Special Cases, and Role of Mismatch
The MBCRB is asymptotically achieved by estimators such as the misspecified MAP (or QMLE-plus-prior) under standard regularity conditions. Its tightness often persists in finite-sample regimes, where the standard BCRB can fail to provide meaningful bounds under misspecification.
In the special case of correct specification—where measurement matrices, noise covariances, and priors align ($\tilde{\mathbf{H}} = \mathbf{H}$, $\tilde{\boldsymbol{\Sigma}}_{\mathbf{n}} = \boldsymbol{\Sigma}_{\mathbf{n}}$, $\tilde{\boldsymbol{\Sigma}}_{\boldsymbol{\eta}} = \boldsymbol{\Sigma}_{\boldsymbol{\theta}}$ with $\tilde{\boldsymbol{\mu}} = \boldsymbol{\mu}$)—the pseudotrue mapping satisfies $\boldsymbol{\eta}_0(\boldsymbol{\theta}) = \boldsymbol{\theta}$, the Jacobian is $\mathbf{A} = \mathbf{I}$, and the MBCRB collapses to the familiar $\mathbf{C} \succeq \left( \mathbf{H}^{\mathsf{T}} \boldsymbol{\Sigma}_{\mathbf{n}}^{-1} \mathbf{H} + \boldsymbol{\Sigma}_{\boldsymbol{\theta}}^{-1} \right)^{-1}$.
Model mismatch—differences between $\mathbf{H}$ and $\tilde{\mathbf{H}}$, $\boldsymbol{\Sigma}_{\mathbf{n}}$ and $\tilde{\boldsymbol{\Sigma}}_{\mathbf{n}}$, and/or the true and assumed priors—is explicitly captured in both the Jacobian-averaged matrix $\mathbf{A}$ and the bias term $\mathbf{r}(\boldsymbol{\theta})$. Severe mismatch leads to a growing divergence between $\boldsymbol{\eta}_0(\boldsymbol{\theta})$ and $\boldsymbol{\theta}$, inflating both the MBCRB covariance bound and the bias penalty above the correctly specified baseline (Tang et al., 2023).
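The inflation effect can be made concrete in a scalar sketch (hypothetical values; the mismatch is an assumed measurement gain drifting below the true gain of 1, so both the covariance term and the bias penalty of the MSE bound grow):

```python
# Scalar toy model (hypothetical values): theta ~ N(0, s_th), x = theta + n
s_th, s_n = 2.0, 0.5
J = 1.0 / s_n + 1.0 / s_th            # BFIM of the true model (scalar)

def mse_bound(h_t):
    # Assumed model x = h_t * eta + n: pseudotrue eta0(theta) = theta / h_t
    A = 1.0 / h_t                     # constant Jacobian of the pseudotrue mapping
    mbcrb = A * A / J                 # covariance bound about eta0
    bias = (A - 1.0) ** 2 * s_th      # E[r^2] with r(theta) = eta0(theta) - theta
    return mbcrb + bias

bounds = [mse_bound(h) for h in (1.0, 0.9, 0.8, 0.7)]
print(bounds)  # grows monotonically as the assumed gain drifts from the true gain 1
```

At h_t = 1 the bound equals the correctly specified BCRB value 1/J; as h_t decreases, the pseudotrue parameter theta / h_t moves away from theta and the MSE bound rises above that baseline.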
5. Practical Implications for Bayesian Estimation and Model Selection
MBCRB addresses the need for rigorous performance benchmarks in real-world estimation tasks where model parameters are typically only approximately known or chosen for tractability. By providing a closed-form, attainable lower bound on estimator covariance (including prior effects and misspecification consequences), MBCRB serves as a quantitative design and validation tool. Model selection methodologies are thereby supported: competing models may be compared directly by evaluating their MBCRBs, facilitating risk assessment and optimization under practical constraints.
The MBCRB thus extends standard Bayesian inference theory, enabling the evaluation of estimation accuracy even in the presence of persistent model misspecification by means of analytically tractable and tight lower bounds.
6. Summary and Outlook
The Misspecified Bayesian Cramér–Rao Bound generalizes the BCRB framework to accommodate parametric model mismatch. It introduces the KL-optimal pseudotrue parameter mapping, utilizes a modified information matrix, and yields a transparent matrix inequality whose form reflects the extent of model mismatch and prior selection. For linear–Gaussian systems, fully closed-form formulas are available, supporting both theoretical analysis and practical estimator design in Bayesian signal processing and statistical inference (Tang et al., 2023). A plausible implication is that the MBCRB provides foundational guidance for estimator selection and model engineering whenever precise modeling of the data-generating process is infeasible.