High-Dimensional Gaussian Process Regression with Soft Kernel Interpolation
Abstract: We introduce Soft Kernel Interpolation (SoftKI), a method that combines aspects of Structured Kernel Interpolation (SKI) and variational inducing point methods to achieve scalable Gaussian Process (GP) regression on high-dimensional datasets. SoftKI approximates a kernel via softmax interpolation from a small number of interpolation points, which are learned by optimizing a combination of the SoftKI marginal log-likelihood (MLL) and, when needed, an approximate MLL for improved numerical stability. Consequently, it overcomes the dimensionality scaling challenges that SKI faces when interpolating from a dense, static lattice, while retaining the flexibility of variational methods to adapt inducing points to the dataset. We demonstrate the effectiveness of SoftKI across various examples and show that it is competitive with other approximate GP methods when the data dimensionality is modest (around 10).
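The core idea, approximating a dense kernel matrix through softmax interpolation weights onto a small set of learned interpolation points, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the RBF base kernel, squared-distance logits, and temperature parameter `tau` are assumptions chosen for clarity, and the paper's exact parameterization may differ.

```python
import numpy as np

def softmax_interp_weights(X, Z, tau=1.0):
    """Row-wise softmax interpolation weights from data X onto points Z.

    W[i, j] is proportional to exp(-||x_i - z_j||^2 / tau); each row of W
    sums to 1, so row i softly interpolates x_i from the points in Z.
    (Squared-distance logits and tau are illustrative assumptions.)
    """
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # (n, m) squared distances
    logits = -d2 / tau
    logits -= logits.max(axis=1, keepdims=True)          # subtract max for stability
    W = np.exp(logits)
    return W / W.sum(axis=1, keepdims=True)

def rbf_kernel(A, B, lengthscale=1.0):
    # Standard RBF kernel, used here as an example base kernel.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# Approximate the n x n kernel matrix as W @ K_zz @ W.T, where only the
# m x m kernel over the interpolation points is evaluated exactly (m << n).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))   # n = 200 inputs in d = 10 dimensions
Z = rng.normal(size=(20, 10))    # m = 20 interpolation points (learned in SoftKI)
W = softmax_interp_weights(X, Z)                 # (n, m) interpolation weights
K_approx = W @ rbf_kernel(Z, Z) @ W.T            # (n, n) approximate kernel
```

Because the interpolation points `Z` are free parameters rather than a fixed lattice, their number does not need to grow exponentially with the input dimension, which is what lets this construction sidestep SKI's grid-based scaling; in SoftKI, `Z` (and the softmax weights through it) would be optimized against the MLL rather than sampled at random as above.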