Fundamental Linear Algebra Problem of Gaussian Inference

Published 15 Oct 2020 in cs.LG, cs.RO, math.ST, and stat.TH (arXiv:2010.08022v1)

Abstract: Underlying many Bayesian inference techniques that seek to approximate the posterior as a Gaussian distribution is a fundamental linear algebra problem that must be solved for both the mean and key entries of the covariance. Even when the true posterior is not Gaussian (e.g., in the case of nonlinear measurement functions), we can use variational schemes that repeatedly solve this linear algebra problem at each iteration. In most cases, the question is not whether a solution to this problem exists, but rather how we can exploit problem-specific structure to find it efficiently. Our contribution is to clearly state the Fundamental Linear Algebra Problem of Gaussian Inference (FLAPOGI) and to provide a novel presentation (using Kronecker algebra) of the not-so-well-known result of Takahashi et al. (1973) that makes it possible to solve for key entries of the covariance matrix. We first provide a global solution and then a local version that can be implemented using local message passing amongst a collection of agents calculating in parallel. Unlike belief propagation, our local scheme is guaranteed to converge in both the mean and desired covariance quantities to the global solution even when the underlying factor graph is loopy; in the case of synchronous updates, we provide a bound on the number of iterations required for convergence. Compared to belief propagation, this guaranteed convergence comes at the cost of additional storage, calculations, and communication links in the case of loops; however, we show how these can be automatically constructed on the fly using only local information.
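
To make the problem statement concrete, below is a minimal sketch (not the paper's method) of the linear algebra problem the abstract describes: given a sparse information (precision) matrix A and information vector b, recover the mean mu = A^{-1} b and only those entries of Sigma = A^{-1} lying on A's sparsity pattern. The function name and the brute-force column-by-column covariance recovery are illustrative assumptions; the paper's contribution is computing those covariance entries efficiently via the Takahashi et al. (1973) result and a convergent local message-passing variant.

```python
# Baseline sketch of the "fundamental linear algebra problem of Gaussian inference":
# solve A mu = b for the mean, and recover the entries of Sigma = A^{-1} that sit on
# A's nonzero pattern (marginal variances and covariances between connected variables).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def gaussian_inference_baseline(A, b):
    """A: sparse SPD information (precision) matrix, b: information vector.
    Returns the posterior mean and a dict of covariance entries on A's sparsity pattern."""
    A = sp.csc_matrix(A)
    lu = spla.splu(A)              # sparse LU; Cholesky would be the natural choice for SPD A
    mu = lu.solve(b)               # posterior mean: solve A mu = b

    n = A.shape[0]
    rows, cols = A.nonzero()       # only these "key" entries of Sigma are required
    sigma_entries = {}
    for j in np.unique(cols):
        e_j = np.zeros(n)
        e_j[j] = 1.0
        col_j = lu.solve(e_j)      # j-th column of Sigma = A^{-1}
        for i in rows[cols == j]:
            sigma_entries[(i, j)] = col_j[i]
    return mu, sigma_entries

# Example: a small chain-structured (tridiagonal) information matrix.
A = sp.diags([[-1.0, -1.0], [2.0, 2.0, 2.0], [-1.0, -1.0]], offsets=[-1, 0, 1])
b = np.array([1.0, 0.0, 1.0])
mu, sigma_entries = gaussian_inference_baseline(A, b)
print(mu)                          # posterior mean
print(sigma_entries[(0, 0)])       # a marginal variance
```

Note that this baseline pays a full linear solve per needed covariance column; avoiding exactly that cost, globally and then via parallel local updates among agents, is what the paper's Takahashi-based scheme addresses.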
