
Efficient Combinatorial Optimization via Heat Diffusion (2403.08757v4)

Published 13 Mar 2024 in stat.ML, cs.LG, math.CO, and physics.app-ph

Abstract: Combinatorial optimization problems are widespread but inherently challenging due to their discrete nature. The primary limitation of existing methods is that they can only access a small fraction of the solution space at each iteration, resulting in limited efficiency in searching for the global optimum. To overcome this challenge, diverging from conventional efforts to expand the solver's search scope, we focus on enabling information to actively propagate to the solver through heat diffusion. By transforming the target function while preserving its optima, heat diffusion facilitates information flow from distant regions to the solver, providing more efficient navigation. Utilizing heat diffusion, we propose a framework for solving general combinatorial optimization problems. The proposed methodology demonstrates superior performance across a range of the most challenging and widely encountered combinatorial optimization problems. Echoing recent advancements in harnessing thermodynamics for generative artificial intelligence, our study further reveals its significant potential for advancing combinatorial optimization.
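The core idea of the abstract can be illustrated with a minimal randomized-smoothing sketch, which is not the paper's exact algorithm: convolving a function with a Gaussian kernel is equivalent to evolving it under the heat equation, so annealing the kernel width from wide to narrow lets gradient information from distant regions of a discrete objective reach the solver. The toy Max-Cut instance, step sizes, and sample counts below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Max-Cut instance: a small random undirected graph (illustrative only).
n = 12
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric adjacency matrix, no self-loops

def cut_value(x):
    """Number of edges cut by the +/-1 spin assignment x."""
    return 0.25 * np.sum(A * (1.0 - np.outer(x, x)))

def smoothed_grad(theta, sigma, samples=256):
    """Monte Carlo gradient of the heat-smoothed objective
    f_sigma(theta) = E[ f(sign(theta + sigma * eps)) ], eps ~ N(0, I).
    Gaussian smoothing solves the heat equation with f as initial data;
    the gradient uses the standard score-function identity
    grad f_sigma(theta) = E[ f(...) * eps ] / sigma."""
    eps = rng.standard_normal((samples, theta.size))
    x = np.where(theta + sigma * eps >= 0.0, 1.0, -1.0)
    vals = np.array([cut_value(xi) for xi in x])
    vals -= vals.mean()  # baseline subtraction for variance reduction
    return (vals[:, None] * eps).mean(axis=0) / sigma

# Anneal the diffusion "time" (kernel width) from broad to narrow while
# ascending the smoothed objective with plain gradient steps.
theta = 0.1 * rng.standard_normal(n)
for sigma in np.linspace(2.0, 0.2, 200):
    theta += 0.5 * smoothed_grad(theta, sigma)

best = np.where(theta >= 0.0, 1.0, -1.0)
print("cut value of rounded solution:", cut_value(best))
```

Early in the anneal, a large sigma blurs the landscape so the gradient reflects the objective far from the current point; as sigma shrinks, the smoothed function converges back to the original, and rounding theta yields a discrete candidate solution.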

Authors (3)
  1. Hengyuan Ma (8 papers)
  2. Wenlian Lu (58 papers)
  3. Jianfeng Feng (57 papers)
Citations (1)
