Convolutional Normalizing Flows for Deep Gaussian Processes (2104.08472v3)

Published 17 Apr 2021 in cs.LG

Abstract: Deep Gaussian processes (DGPs), a hierarchical composition of GP models, have successfully boosted the expressive power of their single-layer counterpart. However, it is impossible to perform exact inference in DGPs, which has motivated the recent development of variational inference-based methods. Unfortunately, either these methods yield a biased posterior belief or it is difficult to evaluate their convergence. This paper introduces a new approach for specifying flexible, arbitrarily complex, and scalable approximate posterior distributions. The posterior distribution is constructed through a normalizing flow (NF) which transforms a simple initial probability into a more complex one through a sequence of invertible transformations. Moreover, a novel convolutional normalizing flow (CNF) is developed to improve the time efficiency and capture dependency between layers. Empirical evaluation shows that CNF DGP outperforms the state-of-the-art approximation methods for DGPs.
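The core idea the abstract describes, transforming a simple base distribution into a complex one through invertible maps while tracking the density via the change-of-variables formula, can be illustrated with a minimal sketch. This is not the paper's convolutional flow; the affine layers and their parameters below are hypothetical, chosen only to show the mechanics.

```python
import numpy as np

# Minimal normalizing-flow sketch: a stack of invertible affine maps
# z_k = a_k * z_{k-1} + b_k pushes a standard-normal base sample toward a
# more complex variable, while the change-of-variables formula tracks the
# log-density. The layer parameters (a_k, b_k) are hypothetical fixed
# values, not the paper's learned convolutional flow.

def base_log_prob(z):
    """Log-density of the standard normal base distribution."""
    return -0.5 * (z ** 2 + np.log(2 * np.pi))

def flow_forward(z, layers):
    """Push a base sample through invertible affine layers.

    Returns the transformed sample and its log-density under the flow,
    log q(x) = log p(z) - sum_k log|a_k|   (change of variables).
    """
    log_prob = base_log_prob(z)
    for a, b in layers:
        z = a * z + b
        log_prob -= np.log(np.abs(a))  # log|det Jacobian| of an affine map
    return z, log_prob

rng = np.random.default_rng(0)
z0 = rng.standard_normal(1000)
layers = [(2.0, 1.0), (0.5, -3.0)]  # two invertible affine layers
x, log_q = flow_forward(z0, layers)

# The composed map here is x = 0.5*(2*z + 1) - 3 = z - 2.5, so the flow's
# density is N(-2.5, 1); the tracked log-density matches it exactly.
```

A flow used as a variational posterior, as in the paper, would make the layer parameters learnable and optimize them against an evidence lower bound; the density-tracking mechanism is the same.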

Authors (5)
  1. Haibin Yu (10 papers)
  2. Dapeng Liu (21 papers)
  3. Yizhou Chen (40 papers)
  4. Bryan Kian Hsiang Low (77 papers)
  5. Patrick Jaillet (100 papers)
Citations (6)
