Fisher Metric for CG Distributions
- The Fisher information metric for CG distributions defines a Riemannian structure on parameter spaces, enabling precise sensitivity and distinguishability analyses.
- It combines an affine-invariant metric for scatter matrices with a flat metric for texture parameters to yield a tractable, block-diagonal formulation.
- Explicit geodesic paths, distance formulas, and inversion operators support tight performance bounds and advanced applications in signal processing and statistical inference.
The Fisher information metric equips parameter spaces of statistical models with Riemannian geometric structures, enabling quantitative analyses of sensitivity, distinguishability, and optimization. Within the compound Gaussian (CG) family and related random matrix models, the Fisher metric admits explicit forms, leading to tractable distance functions, geodesics, and insights into statistical and physical phenomena.
1. Definition of the Fisher Information Metric for Compound Gaussian Distributions
A compound Gaussian (CG) distribution is defined for observed vectors $x_1, \dots, x_n \in \mathbb{C}^p$, each modeled as $x_i \sim \mathcal{CN}(0, \tau_i \Sigma)$, where $\Sigma$ is the unknown scatter (covariance) matrix and $\tau_1, \dots, \tau_n > 0$ are positive scale ("texture") parameters. For identifiability, $\det(\Sigma) = 1$ is often imposed, constraining $\Sigma$ to the manifold $\mathcal{SH}_p^{++}$ of unit-determinant Hermitian positive definite matrices. The parameter space is the product manifold $\mathcal{M}_{p,n} = \mathcal{SH}_p^{++} \times (\mathbb{R}_{++})^n$.
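The generative model can be simulated directly: each observation is a centered Gaussian vector scaled by the square root of its texture. The following is a minimal sketch (real-valued for simplicity rather than circular complex; function names are illustrative):

```python
import numpy as np

def sample_cg(Sigma, tau, rng):
    """Draw one CG sample x_i ~ N(0, tau_i * Sigma) per texture tau_i.

    Real-valued stand-in for the complex circular Gaussian model.
    """
    p = Sigma.shape[0]
    L = np.linalg.cholesky(Sigma)           # Sigma = L L^T
    g = rng.standard_normal((len(tau), p))  # i.i.d. N(0, I) vectors
    return np.sqrt(np.asarray(tau))[:, None] * (g @ L.T)

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
Sigma /= np.linalg.det(Sigma) ** (1 / 2)    # enforce det(Sigma) = 1 for p = 2
tau = [0.5, 1.0, 4.0]
X = sample_cg(Sigma, tau, rng)              # shape (3, 2): one row per x_i
```

The unit-determinant normalization divides by $\det(\Sigma)^{1/p}$; it is written out for $p = 2$ here.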
For a likelihood $p_\theta(x)$ and tangent vectors $\xi$, $\eta$ at $\theta$, the Fisher metric is given by

$$g_\theta(\xi, \eta) = \mathbb{E}_{x \sim p_\theta}\!\left[ \mathrm{D}_\xi \log p_\theta(x) \, \mathrm{D}_\eta \log p_\theta(x) \right],$$

where $\mathrm{D}_\xi$ denotes the directional derivative along $\xi$.
For the CG model, this becomes a block-diagonal metric: for $\xi = (\xi_\Sigma, \xi_\tau)$ and $\eta = (\eta_\Sigma, \eta_\tau)$ at $\theta = (\Sigma, \tau)$,

$$g_\theta(\xi, \eta) = n \, \mathrm{tr}\!\left(\Sigma^{-1} \xi_\Sigma \Sigma^{-1} \eta_\Sigma\right) + p \sum_{i=1}^n \frac{\xi_{\tau_i} \eta_{\tau_i}}{\tau_i^2}.$$
This structure arises naturally from combining the affine-invariant metric on the SPD manifold with a diagonal metric in the texture space (Bouchard et al., 2020).
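As a concrete sketch, the block-diagonal inner product can be evaluated numerically. The normalization assumed here weights the scatter block by $n$ and each texture direction by $p/\tau_i^2$ (the weights used throughout this sketch; check against the cited source for the exact convention):

```python
import numpy as np

def cg_fisher_metric(Sigma, tau, xi_S, xi_t, eta_S, eta_t):
    """Block-diagonal CG Fisher inner product at theta = (Sigma, tau).

    Scatter block: n * tr(Sigma^-1 xi_S Sigma^-1 eta_S);
    texture block: p * sum_i xi_t[i] * eta_t[i] / tau[i]^2.
    """
    n, p = len(tau), Sigma.shape[0]
    Si = np.linalg.inv(Sigma)
    scatter = n * np.trace(Si @ xi_S @ Si @ eta_S)
    texture = p * np.sum(xi_t * eta_t / tau**2)
    return scatter + texture

# Example at Sigma = I; tangents to SH_p^{++} must be trace-free.
Sigma = np.eye(2)
tau = np.array([1.0, 2.0, 4.0])
xi_S = np.array([[1.0, 0.0], [0.0, -1.0]])   # trace-free direction
xi_t = np.array([1.0, 1.0, 1.0])
val = cg_fisher_metric(Sigma, tau, xi_S, xi_t, xi_S, xi_t)
# scatter = 3 * tr(xi_S^2) = 6; texture = 2 * (1 + 1/4 + 1/16) = 2.625
```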
2. Marginalization and the Fisher Information for Randomized CG Distributions
A related construction considers randomization of the scatter by Wishart laws. If $V \sim W_p(q, \sigma)$, a Wishart law on positive definite $p \times p$ matrices with shape $q$ and scale $\sigma$, and, conditional on $V$, $X \sim N(0, V)$, the marginal density of $X$ is

$$f_\sigma(x) = \int_{\mathcal{P}_p} \frac{e^{-\frac{1}{2} x^\top v^{-1} x}}{(2\pi)^{p/2} (\det v)^{1/2}} \cdot \frac{(\det v)^{q - (p+1)/2}\, e^{-\mathrm{tr}(\sigma^{-1} v)}}{\Gamma_p(q)\, (\det \sigma)^{q}} \, dv,$$
where $\Gamma_p$ is the multivariate gamma function (Letac, 2022). The Fisher information $I(\sigma)$, with respect to $\sigma$, lies in the span of two explicit operators acting on the space of symmetric matrices:

$$I(\sigma) = a \, \mathbf{P}(\sigma^{-1}) + b \, \sigma^{-1} \otimes \sigma^{-1},$$

where $\mathbf{P}(\sigma^{-1})\, h = \sigma^{-1} h \sigma^{-1}$ and $(\sigma^{-1} \otimes \sigma^{-1})\, h = \mathrm{tr}(\sigma^{-1} h)\, \sigma^{-1}$.
The coefficients $a$ and $b$ are explicit functions of the dimension $p$ and the shape parameter $q$ alone; their closed forms are derived in (Letac, 2022).
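The compounding step can be checked by simulation: drawing the Wishart scale first and the Gaussian second reproduces the marginal, whose covariance equals the Wishart mean by the law of total covariance. A sketch using scipy's `df`/`scale` parameterization, under which $\mathbb{E}[V] = q\,\sigma$ (note this differs from the exponential-family scaling used by Letac by a constant factor):

```python
import numpy as np
from scipy.stats import wishart

rng = np.random.default_rng(42)
p, q, N = 2, 5.0, 20000
scale = np.eye(p)

# Compound sampling: V ~ Wishart(df=q, scale), then X | V ~ N(0, V).
V = wishart(df=q, scale=scale).rvs(size=N, random_state=rng)  # (N, p, p)
G = rng.standard_normal((N, p))
X = np.einsum("nij,nj->ni", np.linalg.cholesky(V), G)         # X_n = L_n g_n

# Law of total covariance: Cov(X) = E[V] = q * scale.
emp_cov = X.T @ X / N
```

With $N = 20000$ draws the empirical covariance matches $q \cdot \mathrm{scale} = 5 I_2$ to within Monte Carlo error.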
3. Structure and Properties of the CG Fisher Information Metric
The block-diagonal structure of the Fisher metric on $\mathcal{M}_{p,n}$ yields a geodesically complete product manifold:
- On $\mathcal{SH}_p^{++}$: the metric is the affine-invariant metric scaled by the factor $n$.
- On $(\mathbb{R}_{++})^n$: each texture direction is flat (Euclidean in $\log \tau_i$), with metric weight $p/\tau_i^2$.
Geodesics and distances decompose accordingly. The squared Riemannian distance between $\theta_1 = (\Sigma_1, \tau_1)$ and $\theta_2 = (\Sigma_2, \tau_2)$ is

$$d^2(\theta_1, \theta_2) = n \left\| \log\!\left( \Sigma_1^{-1/2} \Sigma_2 \Sigma_1^{-1/2} \right) \right\|_F^2 + p \left\| \log\!\left( \tau_1^{-1} \odot \tau_2 \right) \right\|_2^2,$$

where $\|\cdot\|_F$ is the Frobenius norm, $\log$ acts as the matrix logarithm in the first term and element-wise in the second, and $\odot$ denotes element-wise multiplication (so $\tau_1^{-1} \odot \tau_2$ is the vector of ratios $\tau_{2,i}/\tau_{1,i}$).
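This decomposed distance can be implemented with eigendecompositions for the matrix square root and logarithm. A sketch, assuming the scatter term is weighted by $n$ and the texture term by $p$:

```python
import numpy as np

def spd_log(A):
    """Matrix logarithm of a symmetric/Hermitian positive definite matrix."""
    w, U = np.linalg.eigh(A)
    return (U * np.log(w)) @ U.conj().T

def cg_distance_sq(Sigma1, tau1, Sigma2, tau2):
    """Squared Fisher-Rao distance on SH_p^{++} x (R_{++})^n."""
    n, p = len(tau1), Sigma1.shape[0]
    w, U = np.linalg.eigh(Sigma1)
    S1_isqrt = (U / np.sqrt(w)) @ U.conj().T          # Sigma1^{-1/2}
    M = S1_isqrt @ Sigma2 @ S1_isqrt
    scatter = n * np.linalg.norm(spd_log(M), "fro") ** 2
    texture = p * np.sum(np.log(tau2 / tau1) ** 2)    # log of texture ratios
    return scatter + texture

Sigma1 = np.eye(2)
Sigma2 = np.diag([2.0, 0.5])           # det = 1, stays on SH_p^{++}
tau1 = np.ones(3)
tau2 = np.e * np.ones(3)
d2 = cg_distance_sq(Sigma1, tau1, Sigma2, tau2)
# scatter = 3 * (log(2)^2 + log(1/2)^2); texture = 2 * 3 * 1
```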
The CG Fisher metric generalizes the affine-invariant geometry of the standard Gaussian manifold. The $\Sigma$-block possesses non-positive sectional curvature, scaled down by a factor of $n$ compared to the classical Gaussian case, while the texture directions are exactly flat (Bouchard et al., 2020). This product structure enhances analytical tractability for optimization and statistical inference.
4. Inversion and Operator Formalism for CG Fisher Metrics
The Fisher information operator can be succinctly inverted because it is an invertible quadratic representation plus a rank-one perturbation. For the Wishart-randomized case, the inverse operator is given by

$$I(\sigma)^{-1} = \frac{1}{a} \mathbf{P}(\sigma) - \frac{b}{a(a + pb)} \, \sigma \otimes \sigma,$$

with $\mathbf{P}(\sigma)\, h = \sigma h \sigma$ and $(\sigma \otimes \sigma)\, h = \mathrm{tr}(\sigma h)\, \sigma$; the correction term follows from a Sherman–Morrison-type identity applied to the rank-one part.
This allows direct computation of lower bounds in Cramér–Rao and van Trees inequalities. The Riemannian metric associated with $I(\sigma)$ is, in explicit form,

$$g_\sigma(h, h) = a \, \mathrm{tr}\!\left( (\sigma^{-1} h)^2 \right) + b \left( \mathrm{tr}(\sigma^{-1} h) \right)^2,$$

exhibiting a linear combination of the affine-invariant and trace metrics. The vanishing cross-blocks between the scatter and scale parameters reflect orthogonality in the Riemannian structure (Letac, 2022).
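The rank-one inversion identity can be verified numerically: applying the information operator and then its claimed inverse should return the original tangent matrix. The coefficients $a$, $b$ below are generic illustrative values, not the model-specific ones from the paper:

```python
import numpy as np

def fisher_op(sigma, h, a, b):
    """I(sigma) h = a * sigma^-1 h sigma^-1 + b * tr(sigma^-1 h) * sigma^-1."""
    si = np.linalg.inv(sigma)
    return a * si @ h @ si + b * np.trace(si @ h) * si

def fisher_op_inv(sigma, h, a, b):
    """Sherman-Morrison-type inverse:
    (1/a) * sigma h sigma - b / (a * (a + p*b)) * tr(sigma h) * sigma."""
    p = sigma.shape[0]
    return sigma @ h @ sigma / a - b / (a * (a + p * b)) * np.trace(sigma @ h) * sigma

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
sigma = A @ A.T + 3 * np.eye(3)   # random SPD scatter matrix
h = rng.standard_normal((3, 3))
h = (h + h.T) / 2                 # symmetric tangent direction
a, b = 1.7, 0.4                   # generic positive coefficients
h_back = fisher_op_inv(sigma, fisher_op(sigma, h, a, b), a, b)  # recovers h
```

Expanding the composition shows the $\mathrm{tr}(\sigma^{-1} h)$ terms cancel exactly, for any $a > 0$ and $b$ with $a + pb \neq 0$.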
5. Geometric and Information-Theoretic Implications
The explicit Riemannian geometry of CG Fisher metrics provides a foundation for quantitative analyses in statistical signal processing and information geometry. The flattening of the curvature with increasing sample size $n$ in the $\Sigma$-block modulates distinguishability for high-dimensional models, while the precise product structure supports recursive estimation algorithms and Riemannian optimization schemes, as utilized in recursive change detection tasks (Bouchard et al., 2020).
Randomization of the scatter or scale parameters, for example by Wishart mixing, leads to exact expressions for Fisher information and its inverse. These explicit formulas yield tight performance bounds and offer analytic tools for studying the statistical efficiency of estimators in heavy-tailed or composite noise models (Letac, 2022).
6. Relation to Broader Information Geometry and Physics
The construction of Fisher information metrics on parameter spaces, including the CG family, connects fundamentally to the broader discipline of information geometry. For example, the Fisher metric serves as a core ingredient in reformulations of physical action principles: the two-dimensional Einstein-Hilbert action can be rewritten in statistical-information-theoretic terms, with gravitational actions expressed via Fisher metrics derived from underlying statistical ensembles (Takeuchi, 2018). In such settings, the geometric structure of the Fisher metric elucidates the mapping between statistical models and effective gravitational field theories, though certain constraints, such as normalization and the component count of the metric, pose nontrivial challenges.
7. Limitations and Open Directions
The Fisher metric for CG distributions, while explicit and tractable, exhibits limitations rooted in its block-diagonal structure and parameterization:
- The constraint $\det(\Sigma) = 1$ restricts the $\Sigma$-component to a submanifold, omitting part of the general symmetric positive-definite cone.
- The metric's explicit block diagonalization misses possible off-diagonal couplings between scatter and scale parameters.
- In the context of information geometry-inspired physics, the reduced component count prevents full representation of arbitrary Riemannian metrics within the Fisher metric formalism unless the underlying statistical model is augmented (Takeuchi, 2018).
Despite these limitations, explicit operator expressions and closed-form geodesics render the Fisher metric for CG distributions a powerful analytical tool in statistics, signal processing, and geometric analysis.