Active Level Set Estimation for Continuous Search Space with Theoretical Guarantee (2402.16237v1)

Published 26 Feb 2024 in cs.LG and cs.AI

Abstract: A common problem in many real-world applications is level set estimation, where the goal is to determine the region of the function domain in which the function lies above or below a given threshold. When the function is black-box and expensive to evaluate, the level sets need to be found with as few function evaluations as possible. Existing methods often assume a discrete search space with a finite set of data points for evaluating the function and estimating the level sets. When applied to a continuous search space, these methods must first discretize the space, which often leads to poor results at high computational cost. While some methods do handle the continuous setting, they still lack a proper theoretical convergence guarantee. To address this problem, we propose a novel algorithm that needs no discretization and works directly in continuous search spaces. Our method suggests points by constructing an acquisition function defined as a measure of confidence that the function is above or below the given threshold. A theoretical analysis of the algorithm's convergence to an accurate solution is provided. On multiple synthetic and real-world datasets, our algorithm outperforms state-of-the-art methods.
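The idea can be illustrated with a minimal sketch, which is not the paper's exact algorithm: a Gaussian process surrogate classifies each point as above or below the threshold, and the next query is the point in the continuous domain whose classification is least confident, found by optimizing the acquisition directly rather than discretizing the space. The confidence measure |mu(x) - h| / sigma(x), the scikit-learn GP, the multi-start L-BFGS-B optimizer, and the toy objective below are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (illustrative only): GP-based active level set estimation
# over a continuous domain. At each step, query the point whose classification
# relative to the threshold is least confident, then refit the GP.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

threshold = 0.5                      # level h defining the super-/sub-level sets
bounds = np.array([[-2.0, 2.0]])     # continuous 1-D search space

def f(x):                            # toy black-box function (assumption)
    return np.exp(-x[..., 0] ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 1))   # initial design
y = f(X)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)

def confidence(x):
    # |mu - h| / sigma: how confidently the GP places x above or below h.
    mu, sigma = gp.predict(x.reshape(1, -1), return_std=True)
    return np.abs(mu[0] - threshold) / (sigma[0] + 1e-9)

for _ in range(15):                  # active-learning loop
    gp.fit(X, y)
    # Minimize the confidence directly over the continuous domain (multi-start),
    # so no discretization grid is ever built.
    starts = rng.uniform(bounds[:, 0], bounds[:, 1], size=(10, bounds.shape[0]))
    results = [minimize(confidence, s, bounds=bounds) for s in starts]
    x_next = min(results, key=lambda r: r.fun).x
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next.reshape(1, -1)))

# Estimated super-level set on a dense grid, for inspection only.
grid = np.linspace(bounds[0, 0], bounds[0, 1], 200).reshape(-1, 1)
print(grid[gp.predict(grid) >= threshold].ravel())
```

Because the acquisition is optimized with a continuous solver instead of being evaluated on a fixed grid, the sketch mirrors the property the abstract emphasizes: no discretization of the search space is required.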

Citations (1)
