Closed-form Jensen–Shannon divergence between two Gaussian distributions

Derive an analytic closed-form formula for the differential entropy of a two-component Gaussian mixture and thereby determine a closed-form expression for the Jensen–Shannon divergence between two Gaussian distributions.

Background

The paper discusses the Jensen–Shannon divergence (JSD) and its geometric variants, noting that while many divergences admit closed-form expressions for Gaussian distributions, the ordinary JSD does not. The key obstacle is the differential entropy of a two-component Gaussian mixture, for which no analytic expression is currently known.
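To make the obstacle concrete, the JSD can be written in terms of differential entropies (a standard identity, not specific to this paper):

$$\mathrm{JSD}(p, q) \;=\; h\!\left(\tfrac{p+q}{2}\right) \;-\; \tfrac{1}{2}\bigl(h(p) + h(q)\bigr),$$

where $h$ denotes differential entropy. For Gaussians $p$ and $q$, the entropies $h(p)$ and $h(q)$ are known in closed form ($h(\mathcal{N}(\mu,\sigma^2)) = \tfrac{1}{2}\log(2\pi e\,\sigma^2)$ in the univariate case), but $h\bigl(\tfrac{p+q}{2}\bigr)$, the entropy of the two-component Gaussian mixture, is exactly the term with no known analytic expression.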

This lack of a closed-form expression forces practitioners to approximate the JSD numerically when comparing Gaussian distributions. Addressing this unresolved issue would enable exact computation of the JSD for Gaussian pairs and enhance the applicability of JSD in machine learning and information theory.
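As an illustration of the numerical approximation practitioners must resort to, the sketch below estimates the JSD between two univariate Gaussians by Monte Carlo, using the decomposition $\mathrm{JSD}(p,q) = \tfrac{1}{2}\mathrm{KL}(p\,\|\,m) + \tfrac{1}{2}\mathrm{KL}(q\,\|\,m)$ with $m = (p+q)/2$. This is a generic estimator, not a method from the paper; the function name and parameters are illustrative.

```python
import numpy as np
from scipy.stats import norm

def jsd_gaussians_mc(mu0, s0, mu1, s1, n=200_000, seed=0):
    """Monte Carlo estimate (in nats) of the Jensen-Shannon divergence
    between N(mu0, s0^2) and N(mu1, s1^2)."""
    rng = np.random.default_rng(seed)
    p, q = norm(mu0, s0), norm(mu1, s1)
    xp = p.rvs(size=n, random_state=rng)  # samples from p
    xq = q.rvs(size=n, random_state=rng)  # samples from q

    # log density of the mixture m = (p + q)/2 -- the component whose
    # differential entropy lacks a closed form
    def log_m(x):
        return np.logaddexp(p.logpdf(x), q.logpdf(x)) - np.log(2)

    kl_pm = np.mean(p.logpdf(xp) - log_m(xp))  # KL(p || m)
    kl_qm = np.mean(q.logpdf(xq) - log_m(xq))  # KL(q || m)
    return 0.5 * (kl_pm + kl_qm)
```

The estimate is always bounded by $\log 2$ (the maximum of the JSD in nats) and vanishes when the two Gaussians coincide; an exact closed form would remove both the sampling error and the $O(n)$ cost of such estimators.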

References

However, one drawback that refrains its use in practice is that the JSD between two Gaussian distributions is not known in closed-form since no analytic formula is known for the differential entropy of a two-component Gaussian mixture, and thus the JSD needs to be numerically approximated in practice by various methods.

Two tales for a geometric Jensen--Shannon divergence (2508.05066 - Nielsen, 7 Aug 2025) in Section 1 (Introduction)