
Fast increased fidelity approximate Gibbs samplers for Bayesian Gaussian process regression (2006.06537v1)

Published 11 Jun 2020 in math.ST, stat.ME, and stat.TH

Abstract: The use of Gaussian processes (GPs) is supported by efficient sampling algorithms, a rich methodological literature, and strong theoretical grounding. However, due to their prohibitive computation and storage demands, the use of exact GPs in Bayesian models is limited to problems containing at most several thousand observations. Sampling requires matrix operations that scale at $\mathcal{O}(n^3)$, where $n$ is the number of unique inputs. Storage of individual matrices scales at $\mathcal{O}(n^2)$, and can quickly overwhelm the resources of most modern computers. To overcome these bottlenecks, we develop a sampling algorithm using $\mathcal{H}$ matrix approximation of the matrices comprising the GP posterior covariance. These matrices can approximate the true conditional covariance matrix within machine precision and allow for sampling algorithms that scale at $\mathcal{O}(n \log^2 n)$ time and storage demands scaling at $\mathcal{O}(n \log n)$. We also describe how these algorithms can be used as building blocks to model higher dimensional surfaces at $\mathcal{O}(d \, n \log^2 n)$, where $d$ is the dimension of the surface under consideration, using tensor products of one-dimensional GPs. Though various scalable processes have been proposed for approximating Bayesian GP inference when $n$ is large, to our knowledge, none of these methods show that the approximation's Kullback-Leibler divergence to the true posterior can be made arbitrarily small and may be no worse than the approximation provided by finite computer arithmetic. We describe $\mathcal{H}$-matrices, give an efficient Gibbs sampler using these matrices for one-dimensional GPs, offer a proposed extension to higher dimensional surfaces, and investigate the performance of this fast increased fidelity approximate GP, FIFA-GP, using both simulated and real data sets.
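To make the bottleneck concrete, the sketch below draws one sample from an exact GP posterior via a dense Cholesky factorization, the $\mathcal{O}(n^3)$ time and $\mathcal{O}(n^2)$ storage step that the paper's $\mathcal{H}$-matrix approximation replaces. This is a generic textbook illustration, not the paper's FIFA-GP algorithm; the squared-exponential kernel and noise level are arbitrary choices for the example.

```python
import numpy as np

def sq_exp_kernel(x1, x2, lengthscale=0.2, variance=1.0):
    # Squared-exponential covariance between 1-D input vectors.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp_posterior(x_train, y_train, x_test, noise_var=1e-2, seed=None):
    """Draw one sample from the exact GP posterior at x_test.

    The dense Cholesky factorization of the n x n training covariance
    is the O(n^3)-time, O(n^2)-storage step that H-matrix methods
    approximate at near-linear cost.
    """
    rng = np.random.default_rng(seed)
    K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = sq_exp_kernel(x_train, x_test)
    K_ss = sq_exp_kernel(x_test, x_test)

    L = np.linalg.cholesky(K)                      # O(n^3) bottleneck
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                           # posterior mean
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                           # posterior covariance
    # Small jitter keeps the posterior covariance numerically PSD.
    L_post = np.linalg.cholesky(cov + 1e-8 * np.eye(len(x_test)))
    return mean + L_post @ rng.standard_normal(len(x_test))

x_train = np.linspace(0.0, 1.0, 50)
y_train = np.sin(2.0 * np.pi * x_train)
x_test = np.linspace(0.0, 1.0, 20)
draw = sample_gp_posterior(x_train, y_train, x_test, seed=0)
```

With 50 densely spaced training points and low noise, posterior draws at in-range test points stay close to the underlying sine function, while the cubic cost of the factorization is what makes this exact approach infeasible beyond a few thousand observations.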

Citations (3)