Random Feature Expansions for Deep Gaussian Processes (1610.04386v2)

Published 14 Oct 2016 in stat.ML and stat.CO

Abstract: The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty. Existing inference approaches for DGP models have limited scalability and are notoriously cumbersome to construct. In this work, we introduce a novel formulation of DGPs based on random feature expansions that we train using stochastic variational inference. This yields a practical learning framework which significantly advances the state-of-the-art in inference for DGPs, and enables accurate quantification of uncertainty. We extensively showcase the scalability and performance of our proposal on several datasets with up to 8 million observations, and various DGP architectures with up to 30 hidden layers.
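The core idea in the abstract is to replace each GP layer with a random feature expansion (e.g. random Fourier features for an RBF kernel), so that a deep GP becomes a stack of Bayesian linear models whose weights can be trained with stochastic variational inference. The sketch below is illustrative only and is not the authors' code: function names, dimensions, and the choice to sample weights from a standard normal prior (rather than learn a variational posterior, as in the paper) are assumptions made for the example.

```python
import numpy as np

def rff_layer(X, n_features, lengthscale=1.0, variance=1.0, rng=None):
    """Map inputs X of shape (N, d) to random Fourier features that
    approximate an RBF kernel with the given lengthscale and variance."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Spectral frequencies of the RBF kernel: Omega ~ N(0, 1/lengthscale^2)
    Omega = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 * variance / n_features) * np.cos(X @ Omega + b)

def sample_dgp_prior(X, n_layers=2, n_features=100, hidden_dim=3, seed=0):
    """Draw one function sample from an approximate DGP prior by stacking
    random-feature layers; each layer is a linear map of the features.
    (In the paper the weights are learned variationally; here they are
    simply sampled from a N(0, I) prior for illustration.)"""
    rng = np.random.default_rng(seed)
    H = X
    for layer in range(n_layers):
        out_dim = 1 if layer == n_layers - 1 else hidden_dim
        Phi = rff_layer(H, n_features, rng=rng.integers(1 << 31))
        W = rng.normal(size=(n_features, out_dim))
        H = Phi @ W
    return H

if __name__ == "__main__":
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    f = sample_dgp_prior(X, n_layers=3)
    print(f.shape)  # (200, 1): one sample from the approximate DGP prior
```

Because each layer reduces to features times weights, minibatch stochastic variational inference over the weights scales to the large datasets and deep architectures mentioned in the abstract.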

Authors (4)
  1. Kurt Cutajar (8 papers)
  2. Edwin V. Bonilla (33 papers)
  3. Pietro Michiardi (58 papers)
  4. Maurizio Filippone (58 papers)
Citations (139)