
Fisher Metric for CG Distributions

Updated 27 January 2026
  • The Fisher information metric for CG distributions defines a Riemannian structure on parameter spaces, enabling precise sensitivity and distinguishability analyses.
  • It combines an affine-invariant metric for scatter matrices with a flat metric for texture parameters to yield a tractable, block-diagonal formulation.
  • Explicit geodesic paths, distance formulas, and inversion operators support tight performance bounds and advanced applications in signal processing and statistical inference.

The Fisher information metric equips parameter spaces of statistical models with Riemannian geometric structures, enabling quantitative analyses of sensitivity, distinguishability, and optimization. Within the compound Gaussian (CG) family and related random matrix models, the Fisher metric admits explicit forms, leading to tractable distance functions, geodesics, and insights into statistical and physical phenomena.

1. Definition of the Fisher Information Metric for Compound Gaussian Distributions

A compound Gaussian (CG) distribution is defined for observed vectors $x_1, \ldots, x_n \in \mathbb{C}^p$, each modeled as $x_i \sim \mathcal{CN}(0, \tau_i \Sigma)$, where $\Sigma \in \mathcal{H}_p^{++}$ is the unknown scatter (covariance) matrix and $\tau = [\tau_1, \ldots, \tau_n]^\top \in \mathbb{R}_+^n$ collects non-negative scale ("texture") parameters. For identifiability, $\det\Sigma = 1$ is often imposed, constraining $\Sigma$ to the manifold $\mathcal{SH}_p^{++}$ of unit-determinant Hermitian positive definite matrices. The parameter space is the product manifold $\mathcal{M}_{p,n} = \mathcal{SH}_p^{++} \times \mathbb{R}_+^n$.

For a log-likelihood $\ell(\Sigma, \tau)$ and tangent vectors $\xi = (\xi_\Sigma, \xi_\tau)$, $\eta = (\eta_\Sigma, \eta_\tau)$, the Fisher metric is given by

$$\langle \xi, \eta \rangle_\theta = \mathbb{E}\left[ \partial_\xi \ell \cdot \partial_\eta \ell \right].$$

For the CG model, this becomes a block-diagonal metric, where

$$\langle \xi, \eta \rangle_\theta = \frac{1}{p} \operatorname{tr}\left(\Sigma^{-1}\xi_\Sigma \Sigma^{-1}\eta_\Sigma\right) + \frac{1}{n}\sum_{i=1}^n \frac{\xi_{\tau,i}\,\eta_{\tau,i}}{\tau_i^2}.$$

This structure arises naturally from combining the affine-invariant metric on the Hermitian positive definite manifold with a diagonal metric on the texture space (Bouchard et al., 2020).
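As an illustrative sketch (not code from the cited papers), the block-diagonal inner product above can be evaluated directly with NumPy; the helper name `cg_fisher_inner` is an assumption for this example:

```python
import numpy as np

def cg_fisher_inner(sigma, tau, xi_sigma, xi_tau, eta_sigma, eta_tau):
    """Block-diagonal CG Fisher inner product at the point (sigma, tau):
    (1/p) tr(Sigma^{-1} xi_S Sigma^{-1} eta_S) + (1/n) sum_i xi_i eta_i / tau_i^2."""
    p, n = sigma.shape[0], tau.shape[0]
    sigma_inv = np.linalg.inv(sigma)
    scatter = np.trace(sigma_inv @ xi_sigma @ sigma_inv @ eta_sigma).real / p
    texture = np.sum(xi_tau * eta_tau / tau**2) / n
    return scatter + texture

# Example: identity scatter (p = 2), a trace-free scatter perturbation
# (tangent to the unit-determinant constraint), and two textures.
sigma = np.eye(2)
tau = np.array([1.0, 2.0])
xi_s = np.array([[1.0, 0.0], [0.0, -1.0]])
xi_t = np.array([1.0, 1.0])
val = cg_fisher_inner(sigma, tau, xi_s, xi_t, xi_s, xi_t)  # squared norm of (xi_s, xi_t)
```

Here the scatter block contributes $\operatorname{tr}(\xi_\Sigma^2)/p = 1$ and the texture block $(1 + 1/4)/2 = 0.625$.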

2. Marginalization and the Fisher Information for Randomized CG Distributions

A related construction considers randomization by Wishart laws. If $U \sim \mathrm{Wishart}_n(p, \sigma)$ and, conditional on $U$, $X \mid U \sim \mathcal{N}_n(0, U^{-1})$, the marginal density $f_{p,\sigma}(x)$ is

$$f_{p,\sigma}(x) = (2\pi)^{-n/2}\frac{\Gamma_n\left(p+\frac{n+1}{2}\right)}{\Gamma_n(p)}(\det\sigma)^{(n+1)/2}\left(1 + \frac{1}{2}x^\top\sigma^{-1}x\right)^{-\left(p+\frac{n+1}{2}\right)},$$

where $\Gamma_n$ is the multivariate gamma function (Letac, 2022). The Fisher information $I_p(\sigma)$, with respect to $\sigma$, lies in the span of two explicit operators acting on the space of symmetric matrices:

$$I_p(\sigma) = \alpha_1(p,n)\, \mathcal{P}(\sigma^{-1}) + \beta_1(p,n)\, [\sigma^{-1}\otimes\sigma^{-1}],$$

where $\mathcal{P}(A)(H) = AHA$ and $(A \otimes A)(H) = \operatorname{tr}(AH)\, A$.

The coefficients are

$$\alpha_1(p,n) = 2\,\frac{2p+3}{2p+1}, \qquad \beta_1(p,n) = -\frac{1}{2p+1}.$$
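The marginal density displayed above can be evaluated numerically; the sketch below (an illustration, with `log_f` as an assumed name) uses `scipy.special.multigammaln` for $\log\Gamma_n$ and assumes $p > (n-1)/2$ so the gamma functions are defined:

```python
import numpy as np
from scipy.special import multigammaln  # log of the multivariate gamma function

def log_f(x, p, sigma):
    """Log-density of the Wishart-randomized marginal, per the displayed formula.
    Requires p > (n - 1)/2 so that Gamma_n(p) is defined."""
    n = x.shape[0]
    _, logdet = np.linalg.slogdet(sigma)
    quad = x @ np.linalg.solve(sigma, x)          # x^T sigma^{-1} x
    return (-0.5 * n * np.log(2.0 * np.pi)
            + multigammaln(p + 0.5 * (n + 1), n)
            - multigammaln(p, n)
            + 0.5 * (n + 1) * logdet
            - (p + 0.5 * (n + 1)) * np.log1p(0.5 * quad))

x = np.array([0.3, -1.2, 0.5])
val0 = log_f(x, p=4.0, sigma=np.eye(3))
```

Working on the log scale avoids overflow in $\Gamma_n$ for moderate $p$ and $n$; the density is symmetric in $x$, which gives a quick sanity check.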

3. Structure and Properties of the CG Fisher Information Metric

The block-diagonal structure of the Fisher metric on $\mathcal{M}_{p,n}$ yields a geodesically complete product manifold:

  • On $\mathcal{SH}_p^{++}$, the $\Sigma$-block carries the affine-invariant metric, scaled by $1/p$.
  • On $\mathbb{R}_+^n$, each texture direction is flat, with metric weight $1/(n\tau_i^2)$.

Geodesics and distances decompose accordingly. The squared Riemannian distance between $(\Sigma_0, \tau_0)$ and $(\Sigma_1, \tau_1)$ is

$$d^2\big((\Sigma_0, \tau_0), (\Sigma_1, \tau_1)\big) = \frac{1}{p} \left\| \log\!\left(\Sigma_0^{-1/2} \Sigma_1 \Sigma_0^{-1/2}\right) \right\|_F^2 + \frac{1}{n} \left\| \log\!\left(\tau_0^{\odot(-1)} \odot \tau_1\right) \right\|_2^2,$$

where $\|\cdot\|_F$ is the Frobenius norm, $\odot$ denotes element-wise multiplication, and $\tau_0^{\odot(-1)}$ is the element-wise inverse of $\tau_0$.
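A minimal sketch of this distance, using SciPy's matrix square root and logarithm (the helper name `cg_distance_sq` is illustrative):

```python
import numpy as np
from scipy.linalg import logm, sqrtm, inv

def cg_distance_sq(sigma0, tau0, sigma1, tau1):
    """Squared CG Fisher-Rao distance: affine-invariant term on the scatter
    block plus a flat (log-Euclidean) term on the textures."""
    p, n = sigma0.shape[0], tau0.shape[0]
    s = inv(sqrtm(sigma0))                        # Sigma_0^{-1/2}
    scatter = np.linalg.norm(logm(s @ sigma1 @ s), 'fro')**2 / p
    texture = np.sum(np.log(tau1 / tau0)**2) / n
    return scatter + texture

# Unit-determinant scatter matrices and positive textures; each block
# contributes exactly 1 here, so the squared distance is 2.
sigma0, sigma1 = np.eye(2), np.diag([np.e, 1.0 / np.e])
tau0, tau1 = np.ones(2), np.full(2, np.e)
d2 = cg_distance_sq(sigma0, tau0, sigma1, tau1)
```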

The CG Fisher metric generalizes the affine-invariant geometry of the standard Gaussian manifold. The $\Sigma$-block possesses non-positive sectional curvature, scaled down by $p/2$ compared to the classical Gaussian case, while the $\tau$ directions are exactly flat (Bouchard et al., 2020). This product structure enhances analytical tractability for optimization and statistical inference.
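The product geodesic can likewise be sketched: the scatter block follows the standard affine-invariant geodesic $\Sigma_0^{1/2}\big(\Sigma_0^{-1/2}\Sigma_1\Sigma_0^{-1/2}\big)^t\,\Sigma_0^{1/2}$ and the textures interpolate log-linearly (helper names are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import sqrtm, inv, fractional_matrix_power

def cg_geodesic(sigma0, tau0, sigma1, tau1, t):
    """Point at time t on the product geodesic: the affine-invariant geodesic
    for the scatter matrix, log-linear interpolation for the textures."""
    h = sqrtm(sigma0)                              # Sigma_0^{1/2}
    hinv = inv(h)
    sigma_t = h @ fractional_matrix_power(hinv @ sigma1 @ hinv, t) @ h
    tau_t = tau0**(1.0 - t) * tau1**t
    return sigma_t, tau_t

# Midpoint of two unit-determinant scatter matrices; here it is the identity,
# and the texture midpoint is the element-wise geometric mean.
sigma_mid, tau_mid = cg_geodesic(np.diag([2.0, 0.5]), np.array([1.0, 4.0]),
                                 np.diag([0.5, 2.0]), np.array([4.0, 1.0]), 0.5)
```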

4. Inversion and Operator Formalism for CG Fisher Metrics

The Fisher information operator can be inverted in closed form because it is the sum of a congruence operator and a rank-one term. For the Wishart-randomized case, the inverse operator is given by

$$I_p(\sigma)^{-1} = A(p,n)\, \mathcal{P}(\sigma) + B(p,n)\, [\sigma\otimes\sigma]$$

with

$$A(p,n) = \frac{2p+1}{2(2p+3)}, \qquad B(p,n) = \frac{1}{2p+1-n}.$$
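A quick numerical sanity check under the conventions displayed above: on tangent directions $H$ with $\operatorname{tr}(\sigma^{-1}H) = 0$, the rank-one terms vanish and the composition of the operator with its inverse reduces to $A\,\alpha_1 = 1$, so it must return $H$ regardless of $B$ (helper names are illustrative):

```python
import numpy as np

def fisher_apply(sigma, H, p, n):
    """I_p(sigma)(H) = alpha1 * sigma^{-1} H sigma^{-1} + beta1 * tr(sigma^{-1} H) * sigma^{-1}."""
    a1 = 2.0 * (2 * p + 3) / (2 * p + 1)
    b1 = -1.0 / (2 * p + 1)
    si = np.linalg.inv(sigma)
    return a1 * si @ H @ si + b1 * np.trace(si @ H) * si

def fisher_inverse_apply(sigma, H, p, n):
    """I_p(sigma)^{-1}(H) = A * sigma H sigma + B * tr(sigma H) * sigma."""
    A = (2 * p + 1) / (2.0 * (2 * p + 3))
    B = 1.0 / (2 * p + 1 - n)
    return A * sigma @ H @ sigma + B * np.trace(sigma @ H) * sigma

# A direction with tr(sigma^{-1} H) = 0: the round trip must be the identity.
sigma = np.diag([2.0, 0.5])
H = np.array([[0.0, 1.0], [1.0, 0.0]])
back = fisher_inverse_apply(sigma, fisher_apply(sigma, H, 3, 2), 3, 2)
```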

This allows direct computation of lower bounds in Cramér-Rao and van Trees inequalities. The Riemannian metric associated with $I_p(\sigma)$ is, in explicit form,

$$g_{p,\sigma}(H,K) = 2\,\frac{2p+3}{2p+1}\operatorname{tr}(\sigma^{-1} H \sigma^{-1} K) - \frac{1}{2p+1}\operatorname{tr}(\sigma^{-1} H)\operatorname{tr}(\sigma^{-1} K),$$

exhibiting a linear combination of the affine-invariant and trace metrics. The vanishing cross-blocks between $\Sigma$ and $\tau$ reflect orthogonality in the Riemannian structure (Letac, 2022).
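As a small illustrative check (names assumed), the explicit metric can be evaluated directly; symmetry $g_{p,\sigma}(H,K) = g_{p,\sigma}(K,H)$ follows from the cyclic property of the trace:

```python
import numpy as np

def g_metric(sigma, H, K, p):
    """Evaluate g_{p,sigma}(H, K) from the displayed formula."""
    si = np.linalg.inv(sigma)
    a1 = 2.0 * (2 * p + 3) / (2 * p + 1)
    b1 = 1.0 / (2 * p + 1)
    return a1 * np.trace(si @ H @ si @ K) - b1 * np.trace(si @ H) * np.trace(si @ K)

# A positive definite base point and two symmetric tangent directions.
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
sigma = M @ M.T + 3.0 * np.eye(3)
H = np.array([[1.0, 0.2, 0.0], [0.2, 0.5, 0.1], [0.0, 0.1, 2.0]])
K = np.array([[0.3, 0.0, 0.4], [0.0, 1.0, 0.0], [0.4, 0.0, 0.7]])
g_hk, g_kh = g_metric(sigma, H, K, 2), g_metric(sigma, K, H, 2)
```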

5. Geometric and Information-Theoretic Implications

The explicit Riemannian geometry of CG Fisher metrics provides a foundation for quantitative analyses in statistical signal processing and information geometry. The flattening of the curvature with increasing $p$ in the $\Sigma$-block modulates distinguishability for high-dimensional models, while the precise product structure supports recursive estimation algorithms and Riemannian optimization schemes, as utilized in recursive change detection tasks (Bouchard et al., 2020).

Randomization of the scatter or scale parameters, for example by Wishart mixing, leads to exact expressions for Fisher information and its inverse. These explicit formulas yield tight performance bounds and offer analytic tools for studying the statistical efficiency of estimators in heavy-tailed or composite noise models (Letac, 2022).

6. Relation to Broader Information Geometry and Physics

The construction of Fisher information metrics on parameter spaces, including the CG family, connects fundamentally to the broader discipline of information geometry. For example, the Fisher metric serves as a core ingredient in reformulations of physical action principles, such as rewriting the two-dimensional Einstein-Hilbert action in statistical–information-theoretic terms, as demonstrated by expressing gravitational actions using Fisher metrics derived from underlying statistical ensembles (Takeuchi, 2018). In such settings, the geometric structure of the Fisher metric elucidates the mapping between statistical models and effective gravitational field theories, though certain constraints—such as normalization and component count of the metric—pose nontrivial challenges.

7. Limitations and Open Directions

The Fisher metric for CG distributions, while explicit and tractable, exhibits limitations rooted in its block-diagonal structure and parameterization:

  • The constraint $\det\Sigma = 1$ restricts the $\Sigma$-component to a submanifold, omitting part of the full Hermitian positive-definite cone.
  • The metric's explicit block diagonalization misses possible off-diagonal couplings between scatter and scale parameters.
  • In the context of information geometry-inspired physics, the reduced component count prevents full representation of arbitrary Riemannian metrics within the Fisher metric formalism unless the underlying statistical model is augmented (Takeuchi, 2018).

Despite these, explicit operator expressions and closed-form geodesics render the Fisher metric for CG distributions a powerful analytical tool in statistics, signal processing, and geometric analysis.
