Amortised Variational Inference

Updated 27 April 2026
  • Amortised Variational Inference is a technique that replaces per-instance optimization with a global neural network to perform efficient approximate Bayesian inference in latent variable models.
  • It leverages the evidence lower bound (ELBO) to jointly train inference and generative networks, enhancing scalability and enabling single forward-pass inference for unseen data.
  • AVI finds applications in deep generative modeling, structured Bayesian inference, and meta-learning, offering practical benefits for fast, scalable probabilistic reasoning.

Amortised Variational Inference (AVI) is a core methodology in contemporary probabilistic machine learning for efficient, scalable, and flexible approximate inference in complex latent variable models. By replacing instance-specific optimization of variational parameters with a shared parametric inference mechanism, typically a neural network, AVI has become foundational to deep generative modeling, structured Bayesian inference, and probabilistic meta-learning. Recent advances address its theoretical guarantees, computational trade-offs, and algorithmic extensions, with methodological and empirical progress spanning hierarchical Bayesian models, dynamical systems, meta-learning, and stochastic processes.

1. Conceptual Foundation and Mathematical Framework

Amortised Variational Inference seeks to approximate an intractable posterior distribution $p(z|x)$ in a latent variable model by introducing a parameterized family of tractable distributions $q_\phi(z|x)$ and minimizing the reverse Kullback–Leibler divergence $\mathrm{KL}(q_\phi(z|x) \,\|\, p(z|x))$, which is equivalent to maximizing the evidence lower bound (ELBO):

$$\mathcal{L}(\phi, \theta; x) = \mathbb{E}_{q_\phi(z|x)}\!\left[\log p_\theta(x, z) - \log q_\phi(z|x)\right].$$

Instead of instantiating and optimizing local variational parameters for each $x$ ("classical" mean-field VI), AVI employs a global inference network (encoder) with parameters $\phi$, typically a neural network mapping $x$ to the parameters of $q_\phi(z|x)$ (Ganguly et al., 2022). This network is trained jointly with the generative model $p_\theta(x|z)$ so that, for unseen data $x^*$, variational inference reduces to a single forward pass with no per-instance optimization.
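As a concrete illustration (not drawn from the cited survey), the following is a minimal PyTorch sketch of this setup in a VAE-style model, assuming a diagonal-Gaussian $q_\phi(z|x)$, a standard-normal prior, and a Bernoulli likelihood over binarized inputs; the class name AmortisedVAE, the layer sizes, and the architecture are illustrative placeholders.

```python
import torch
import torch.nn as nn
from torch.distributions import Normal, kl_divergence

class AmortisedVAE(nn.Module):
    """Minimal amortised VI sketch: a shared encoder maps any x to the
    parameters of q_phi(z|x), so inference on new data is one forward pass."""

    def __init__(self, x_dim: int = 784, z_dim: int = 16, hidden: int = 256):
        super().__init__()
        # Inference network (encoder): x -> (mu, log-variance) of q_phi(z|x)
        self.encoder = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, 2 * z_dim))
        # Generative network (decoder): z -> logits of p_theta(x|z)
        self.decoder = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, x_dim))

    def elbo(self, x: torch.Tensor) -> torch.Tensor:
        mu, log_var = self.encoder(x).chunk(2, dim=-1)
        q = Normal(mu, (0.5 * log_var).exp())   # q_phi(z|x), diagonal Gaussian
        z = q.rsample()                         # reparameterised sample
        logits = self.decoder(z)
        # E_q[log p_theta(x|z)]: 1-sample Monte Carlo estimate, Bernoulli
        # likelihood (assumes x is binarized, e.g. thresholded MNIST pixels)
        log_px_z = -nn.functional.binary_cross_entropy_with_logits(
            logits, x, reduction="none").sum(-1)
        # KL(q_phi(z|x) || p(z)) against a standard-normal prior, closed form
        kl = kl_divergence(q, Normal(torch.zeros_like(mu),
                                     torch.ones_like(mu))).sum(-1)
        return (log_px_z - kl).mean()           # ELBO, averaged over the batch
```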

The ELBO is optimized with stochastic gradient methods over minibatches; for continuous latent variables, low-variance gradients with respect to $\phi$ are typically obtained via the reparameterization trick, while gradients with respect to $\theta$ follow directly from the same Monte Carlo objective.
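Continuing the sketch above, a joint training step might look as follows; maximizing the ELBO is implemented as minimizing its negative, and a single optimizer updates the encoder ($\phi$) and decoder ($\theta$) parameters together.

```python
model = AmortisedVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # updates phi and theta jointly

def train_step(x_batch: torch.Tensor) -> float:
    opt.zero_grad()
    loss = -model.elbo(x_batch)   # maximise ELBO <=> minimise -ELBO
    loss.backward()               # gradients flow through rsample (reparameterisation)
    opt.step()
    return loss.item()

# For unseen data x_star, amortised inference is a single encoder forward pass:
# mu, log_var = model.encoder(x_star).chunk(2, dim=-1)
```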

References

  1. Ganguly, A., Jain, S., & Watchareeruetai, U. (2022). Amortized Variational Inference: A Systematic Review. arXiv:2209.10888.
