A note on the unique properties of the Kullback--Leibler divergence for sampling via gradient flows (2507.04330v1)
Published 6 Jul 2025 in stat.ME, cs.LG, math.ST, stat.CO, and stat.TH
Abstract: We consider the problem of sampling from a probability distribution $\pi$. It is well known that this can be written as an optimisation problem over the space of probability distributions in which we aim to minimise a divergence from $\pi$, and the optimisation problem is normally solved through gradient flows in the space of probability distributions with an appropriate metric. We show that the Kullback--Leibler divergence is the only divergence in the family of Bregman divergences whose gradient flow w.r.t. many popular metrics does not require knowledge of the normalising constant of $\pi$.
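The key point is that the Wasserstein gradient flow of $\mathrm{KL}(\rho \,\|\, \pi)$ depends on $\pi$ only through $\nabla \log \pi$, so the normalising constant cancels. Below is a minimal sketch illustrating this with the unadjusted Langevin algorithm (a time-discretisation of that flow) on a toy unnormalised target; the potential $U$ and all parameters are illustrative choices, not taken from the paper.

```python
import numpy as np

# Toy unnormalised target: pi(x) ∝ exp(-U(x)) with U(x) = x^4/4 - x^2/2 (double well).
# The normalising constant Z = ∫ exp(-U(x)) dx is never computed anywhere below.
def grad_log_pi(x):
    # ∇ log π(x) = -∇U(x); note this is independent of Z.
    return -(x**3 - x)

def ula_sample(n_particles=1000, n_steps=5000, step=1e-2, seed=0):
    """Unadjusted Langevin algorithm: Euler--Maruyama discretisation of the
    Wasserstein gradient flow of KL(rho || pi). Only grad_log_pi is required."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)
    for _ in range(n_steps):
        x = x + step * grad_log_pi(x) + np.sqrt(2 * step) * rng.standard_normal(n_particles)
    return x

if __name__ == "__main__":
    samples = ula_sample()
    print("sample mean:", samples.mean(), "sample variance:", samples.var())
```

For a Bregman divergence other than the KL, the analogous gradient-flow update would in general involve $\pi$ itself rather than only $\nabla \log \pi$, which is why the normalising constant can no longer be avoided.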