Closed-form Jensen–Shannon divergence for Gaussian distributions (arithmetic mean)

Derive a closed-form expression for the Jensen–Shannon divergence JS(P||Q) defined by JS(P||Q) = (1/2) KL(P || (P+Q)/2) + (1/2) KL(Q || (P+Q)/2) between two multivariate Gaussian distributions P = N(m0, C0) and Q = N(m1, C1) on R^n, where m0, m1 ∈ R^n and C0, C1 ∈ Sym^{++}(n).
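For context, each KL term would have a closed form if the reference measure were Gaussian; the standard identity (a well-known result, not taken from the source) is shown below. The obstruction is that the mixture (P+Q)/2 is not Gaussian, so this identity does not apply to the two KL terms in JS(P||Q).

```latex
% Standard closed-form KL divergence between Gaussians on R^n:
\mathrm{KL}\bigl(\mathcal{N}(m_0,C_0)\,\|\,\mathcal{N}(m_1,C_1)\bigr)
= \tfrac{1}{2}\Bigl[\operatorname{tr}\bigl(C_1^{-1}C_0\bigr)
  + (m_1-m_0)^{\top}C_1^{-1}(m_1-m_0) - n
  + \ln\tfrac{\det C_1}{\det C_0}\Bigr].
```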

Background

The Jensen–Shannon divergence (JSD) is a widely used symmetric measure derived from the Kullback–Leibler divergence by mixing the two distributions using the arithmetic mean. For Gaussian distributions in finite dimensions, the arithmetic mixture (P+Q)/2 is not Gaussian, which complicates derivations and has prevented a closed-form expression from being established.
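Because the arithmetic mixture is a two-component Gaussian mixture rather than a Gaussian, JS(P||Q) between Gaussians is in practice estimated numerically. A minimal Monte Carlo sketch (function name and sample sizes are illustrative, not from the source):

```python
import numpy as np
from scipy.stats import multivariate_normal

def js_divergence_mc(m0, C0, m1, C1, n_samples=100_000, seed=0):
    """Monte Carlo estimate of JS(P||Q) for P = N(m0, C0), Q = N(m1, C1).

    JS(P||Q) = 0.5*KL(P||M) + 0.5*KL(Q||M) with M = (P+Q)/2.
    M is a mixture, not a Gaussian, hence no closed form is available.
    """
    rng = np.random.default_rng(seed)
    P = multivariate_normal(m0, C0)
    Q = multivariate_normal(m1, C1)

    def kl_to_mixture(D):
        # E_D[log D(x) - log M(x)], estimated from samples of D.
        x = D.rvs(size=n_samples, random_state=rng)
        log_m = np.logaddexp(P.logpdf(x), Q.logpdf(x)) - np.log(2.0)
        return np.mean(D.logpdf(x) - log_m)

    return 0.5 * kl_to_mixture(P) + 0.5 * kl_to_mixture(Q)
```

The estimate is bounded above by ln 2, the maximum of the JSD, which gives a simple sanity check on the sampler.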

In contrast, the geometric mean leads to a Gaussian mixture and admits closed-form formulas, which the paper leverages to develop the Geometric Jensen–Shannon divergence. The explicit absence of a closed-form for the standard (arithmetic-mean) JSD for Gaussians motivates this open problem.
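A minimal sketch of the closed-form route the geometric mean enables: the normalized geometric mixture of two Gaussians is itself Gaussian, so each KL term reduces to the standard Gaussian KL identity. Function names are illustrative, and the α = 1/2 construction below follows the usual geometric-JSD definition rather than the source's Hilbert-space treatment:

```python
import numpy as np

def kl_gauss(m0, C0, m1, C1):
    """Closed-form KL(N(m0, C0) || N(m1, C1)) on R^n."""
    n = len(m0)
    C1_inv = np.linalg.inv(C1)
    d = m1 - m0
    return 0.5 * (np.trace(C1_inv @ C0) + d @ C1_inv @ d - n
                  + np.log(np.linalg.det(C1) / np.linalg.det(C0)))

def geometric_js(m0, C0, m1, C1):
    """Geometric JSD with alpha = 1/2: the normalized geometric mean of
    two Gaussians is Gaussian, so both KL terms are closed-form."""
    P0, P1 = np.linalg.inv(C0), np.linalg.inv(C1)
    Cg = np.linalg.inv(0.5 * (P0 + P1))        # geometric-mean covariance
    mg = Cg @ (0.5 * (P0 @ m0 + P1 @ m1))      # geometric-mean mean
    return 0.5 * kl_gauss(m0, C0, mg, Cg) + 0.5 * kl_gauss(m1, C1, mg, Cg)
```

At α = 1/2 the geometric mixture is symmetric in P and Q, so the resulting divergence is symmetric as well.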

References

The arithmetic mean of two Gaussian measures $P,Q$ is not a Gaussian measure and, as of this writing, no closed-form formula is known for $JS(P||Q) = JS_{A_{1/2}}(P||Q)$ (however, $JS(P||Q)$ admits a closed-form formula if $P,Q$ are Cauchy distributions).

Geometric Jensen–Shannon Divergence Between Gaussian Measures on Hilbert Space (arXiv:2506.10494, Quang et al., 12 Jun 2025), Section 1, Introduction (Geometric Jensen–Shannon divergence between Gaussian measures on R^n).