Affine calculus for constrained minima of the Kullback-Leibler divergence (2502.02177v3)
Abstract: The non-parametric version of Amari's dually affine Information Geometry provides a practical calculus for performing computations of interest in statistical machine learning. The method uses the notion of a statistical bundle, a mathematical structure that includes both probability densities and random variables to capture the spirit of Fisherian statistics. We focus on computations involving a constrained minimization of the Kullback-Leibler divergence. We show how to obtain neat and principled versions of known computations in applications such as mean-field approximation, adversarial generative models, and variational Bayes.
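To make the abstract's central object concrete, here is a minimal sketch of one of the constrained KL minimizations it mentions: the mean-field approximation, posed as minimizing KL(q || p) over product densities q. The sketch uses a correlated bivariate Gaussian target and numerically recovers the classical closed-form answer (mean-field variances equal to the inverse diagonal of the precision matrix). It is an illustration under standard Gaussian formulas, not a reproduction of the paper's statistical-bundle calculus; the function and variable names are my own.

```python
import numpy as np
from scipy.optimize import minimize

# Target density: a correlated bivariate Gaussian p = N(mu, Sigma).
mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

def kl_qp(theta):
    """KL(q || p) for a mean-field Gaussian q = N(m1, s1^2) x N(m2, s2^2).

    theta = (m1, m2, log s1, log s2); parameterizing by log standard
    deviations keeps the optimization unconstrained.
    """
    m = theta[:2]
    s2 = np.exp(2.0 * theta[2:])  # variances of the factors of q
    S = np.diag(s2)
    d = m - mu
    # Standard closed form for KL between two Gaussians (dimension k = 2).
    return 0.5 * (np.trace(Sigma_inv @ S) + d @ Sigma_inv @ d
                  - 2.0 + np.log(np.linalg.det(Sigma) / np.prod(s2)))

res = minimize(kl_qp, x0=np.zeros(4), method="BFGS")
s2_hat = np.exp(2.0 * res.x[2:])

# Known closed form: the optimal mean-field variances are the inverses
# of the diagonal entries of the precision matrix, 1 / (Sigma^{-1})_ii.
print("optimized variances:", s2_hat)
print("closed-form target :", 1.0 / np.diag(Sigma_inv))
```

Running this shows the familiar mean-field effect the paper's calculus treats in a principled way: both optimized variances come out near 0.36, smaller than the marginal variances of 1.0, because KL(q || p) penalizes q for placing mass where the correlated target has little.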