
Fusion of Gaussian Processes Predictions with Monte Carlo Sampling (2403.01389v1)

Published 3 Mar 2024 in cs.LG and stat.ML

Abstract: In science and engineering, we often work with models designed for accurate prediction of variables of interest. Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes. In this paper, we operate within the Bayesian paradigm, relying on Gaussian processes as our models. These models generate predictive probability density functions (pdfs), and the objective is to integrate them systematically, employing both linear and log-linear pooling. We introduce novel approaches for log-linear pooling, determining input-dependent weights for the predictive pdfs of the Gaussian processes. The aggregation of the pdfs is realized through Monte Carlo sampling, drawing samples of weights from their posterior. The performance of these methods, as well as those based on linear pooling, is demonstrated using a synthetic dataset.
