Gaussian Process Regression with Soft Inequality and Monotonicity Constraints (2404.02873v1)

Published 3 Apr 2024 in stat.ML, cs.LG, and math.OC

Abstract: Gaussian process (GP) regression is a non-parametric Bayesian framework for approximating complex models. Standard GP regression, however, is unbounded, so its predictions at some points can take physically infeasible values. We introduce a new GP method that enforces physical constraints in a probabilistic manner. The model is trained with quantum-inspired Hamiltonian Monte Carlo (QHMC), an efficient sampler for a broad class of distributions. Unlike the standard Hamiltonian Monte Carlo algorithm, in which a particle has a fixed mass, QHMC allows the particle to have a random mass matrix drawn from a probability distribution. By applying QHMC to inequality- and monotonicity-constrained GP regression in this probabilistic sense, our approach improves accuracy and reduces the variance of the resulting GP model. Experiments on several datasets show that the proposed approach is efficient: it accelerates the sampling process while maintaining accuracy, and it remains applicable to high-dimensional problems.
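To make the random-mass mechanism concrete, here is a minimal NumPy sketch of QHMC on a toy 2-D Gaussian target rather than the paper's constrained GP posterior. The log-normal scalar mass, the step size, and all function names are illustrative assumptions, not the authors' implementation; the only idea taken from the abstract is that the mass is resampled each iteration instead of being held fixed as in standard HMC.

    # A minimal QHMC sketch, assuming a log-normal distribution over a
    # scalar mass; target and hyperparameters are illustrative only.
    import numpy as np

    def neg_log_prob(x):
        return 0.5 * np.sum(x**2)        # toy target: standard normal

    def grad_neg_log_prob(x):
        return x

    def qhmc_sample(x0, n_samples=1000, n_leapfrog=20, step=0.1,
                    mu_m=0.0, sigma_m=1.0, rng=None):
        rng = rng or np.random.default_rng(0)
        x, samples = np.array(x0, float), []
        for _ in range(n_samples):
            # QHMC: resample a random mass each step, unlike HMC's fixed mass
            m = np.exp(rng.normal(mu_m, sigma_m))
            p = rng.normal(0.0, np.sqrt(m), size=x.shape)
            x_new, p_new = x.copy(), p.copy()
            # standard leapfrog integration under the sampled mass
            p_new -= 0.5 * step * grad_neg_log_prob(x_new)
            for _ in range(n_leapfrog - 1):
                x_new += step * p_new / m
                p_new -= step * grad_neg_log_prob(x_new)
            x_new += step * p_new / m
            p_new -= 0.5 * step * grad_neg_log_prob(x_new)
            # Metropolis accept/reject with kinetic energy p·p / (2m)
            h_old = neg_log_prob(x) + 0.5 * np.dot(p, p) / m
            h_new = neg_log_prob(x_new) + 0.5 * np.dot(p_new, p_new) / m
            if rng.random() < np.exp(h_old - h_new):
                x = x_new
            samples.append(x.copy())
        return np.array(samples)

    draws = qhmc_sample(np.zeros(2))
    print(draws.mean(axis=0), draws.std(axis=0))   # ≈ [0, 0] and ≈ [1, 1]

In the paper's setting, neg_log_prob would be replaced by the constrained GP posterior, with soft inequality and monotonicity constraints entering as probabilistic penalty terms; the random mass distribution is what lets QHMC traverse such distributions more efficiently than fixed-mass HMC.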

