Variational inference and density estimation with non-negative tensor train (2507.21519v1)
Abstract: This work proposes an efficient numerical approach for compressing a high-dimensional discrete distribution function into a non-negative tensor train (NTT) format. We consider two settings, variational inference and density estimation, in which one has access to either an unnormalized analytic formula for the distribution or samples drawn from it. The compression proceeds in two stages. In the first stage, we use existing subroutines to encode the distribution function in a tensor train format. In the second stage, we fit an NTT ansatz to the obtained tensor train. The fitting procedure uses a log-barrier term to ensure the positivity of each tensor component, together with a second-order alternating minimization scheme to accelerate convergence. In practice, the proposed NTT fitting procedure converges drastically faster than a previously proposed multiplicative update method. Through challenging numerical experiments, we show that our approach accurately compresses target distribution functions.
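The second-stage fitting can be illustrated on a deliberately simplified analogue. In the sketch below, a rank-2 non-negative matrix factorization stands in for the NTT cores, and plain gradient steps with backtracking replace the paper's second-order alternating scheme; only the log-barrier idea for enforcing strict positivity is carried over. All names, dimensions, and constants here are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the NTT fitting stage: fit two strictly positive factors
# A (6 x r) and B (r x 6) so that A @ B approximates a given non-negative
# target T, analogous to fitting non-negative cores to a tensor train.
r = 2
A_true = rng.uniform(0.5, 1.5, (6, r))
B_true = rng.uniform(0.5, 1.5, (r, 6))
T = A_true @ B_true  # non-negative target to compress

mu = 1e-3   # log-barrier weight (keeps factor entries strictly positive)
lr = 1e-2   # gradient step size


def barrier_loss(A, B):
    """0.5 * ||A B - T||_F^2 minus the log barrier on both factors."""
    return 0.5 * np.sum((A @ B - T) ** 2) - mu * (
        np.sum(np.log(A)) + np.sum(np.log(B))
    )


def step(X, grad):
    """Gradient step with backtracking so the iterate stays strictly positive."""
    t = lr
    while np.any(X - t * grad <= 0):
        t *= 0.5
    return X - t * grad


A = rng.uniform(0.5, 1.5, (6, r))
B = rng.uniform(0.5, 1.5, (r, 6))
init_err = np.linalg.norm(A @ B - T)

for _ in range(2000):
    # Alternating minimization: update A with B fixed, then B with A fixed.
    # Gradient of the barrier objective w.r.t. A is (A B - T) B^T - mu / A.
    A = step(A, (A @ B - T) @ B.T - mu / A)
    B = step(B, A.T @ (A @ B - T) - mu / B)

final_err = np.linalg.norm(A @ B - T)
```

In the paper's setting each alternating subproblem is solved with a second-order (Newton-type) update rather than a single gradient step, which is what yields the reported acceleration over multiplicative updates; the barrier term plays the same positivity-preserving role in both cases.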