Black-Box Optimization with Implicit Constraints for Public Policy (2310.18449v5)

Published 27 Oct 2023 in stat.ML, cs.CE, and cs.LG

Abstract: Black-box optimization (BBO) has become increasingly relevant for tackling complex decision-making problems, especially in public policy domains such as police redistricting. However, its broader application in public policymaking is hindered by the complexity of defining feasible regions and the high dimensionality of decisions. This paper introduces a novel BBO framework, termed Conditional And Generative Black-box Optimization (CageBO). The approach leverages a conditional variational autoencoder to learn the distribution of feasible decisions, enabling a two-way mapping between the original decision space and a simplified, constraint-free latent space. CageBO efficiently handles the implicit constraints often found in public policy applications, allowing for optimization in the latent space while evaluating objectives in the original space. We validate our method through a case study on large-scale police redistricting problems in Atlanta, Georgia. Our results show that CageBO offers notable improvements in performance and efficiency over the baselines.
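
The abstract describes optimizing in a constraint-free latent space learned by a conditional variational autoencoder while evaluating the black-box objective in the original decision space. The following is a minimal sketch of that general latent-space Bayesian optimization idea, not the paper's implementation: the decode function is a placeholder linear-tanh map standing in for a trained CVAE decoder, the objective is a toy quadratic, and the Gaussian-process surrogate with expected improvement is a standard choice assumed here rather than taken from the paper.

```python
# Minimal latent-space BO sketch (assumptions noted above; not the CageBO code).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
LATENT_DIM, DECISION_DIM = 2, 10
A = rng.normal(size=(DECISION_DIM, LATENT_DIM))  # placeholder "decoder" weights

def decode(z):
    """Stand-in for a trained CVAE decoder mapping latent codes to feasible decisions."""
    return np.tanh(z @ A.T)

def objective(x):
    """Toy black-box objective evaluated in the original decision space."""
    return -np.sum((x - 0.3) ** 2, axis=-1)

def expected_improvement(mu, sigma, best, xi=0.01):
    """Expected-improvement acquisition for maximization."""
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best - xi) / sigma
    return (mu - best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Initial design drawn in the unconstrained latent space.
Z = rng.uniform(-2, 2, size=(5, LATENT_DIM))
y = objective(decode(Z))

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(Z, y)                                        # surrogate over latent codes
    cand = rng.uniform(-2, 2, size=(512, LATENT_DIM))   # random candidate latents
    mu, sigma = gp.predict(cand, return_std=True)
    z_next = cand[np.argmax(expected_improvement(mu, sigma, y.max()))]
    y_next = objective(decode(z_next[None]))            # evaluate in the original space
    Z = np.vstack([Z, z_next])
    y = np.append(y, y_next)

print("best objective found:", y.max())
```

The key design point mirrored here is that the surrogate model and acquisition operate only on the low-dimensional, unconstrained latent variables, while every candidate is decoded back to a (by construction feasible) decision before the expensive objective is queried.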

