
A Generative Neural Annealer for Black-Box Combinatorial Optimization (2505.09742v1)

Published 14 May 2025 in cs.LG, cond-mat.dis-nn, cond-mat.stat-mech, cs.AI, and cs.NE

Abstract: We propose a generative, end-to-end solver for black-box combinatorial optimization that emphasizes both sample efficiency and solution quality on NP problems. Drawing inspiration from annealing-based algorithms, we treat the black-box objective as an energy function and train a neural network to model the associated Boltzmann distribution. By conditioning on temperature, the network captures a continuum of distributions--from near-uniform at high temperatures to sharply peaked around global optima at low temperatures--thereby learning the structure of the energy landscape and facilitating global optimization. When queries are expensive, the temperature-dependent distributions naturally enable data augmentation and improve sample efficiency. When queries are cheap but the problem remains hard, the model learns implicit variable interactions, effectively "opening" the black box. We validate our approach on challenging combinatorial tasks under both limited and unlimited query budgets, showing competitive performance against state-of-the-art black-box optimizers.
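For reference, the temperature-conditioned target the abstract describes is the Boltzmann distribution p_T(x) ∝ exp(-E(x)/T). The toy snippet below, which uses a hypothetical 3-bit energy table rather than anything from the paper, illustrates the continuum the abstract describes: near-uniform at high T, sharply peaked on the minimum-energy state at low T.

```python
import numpy as np

# Toy illustration of the Boltzmann continuum: for a hypothetical 3-bit
# energy table, p_T(x) ~ exp(-E(x)/T) is near-uniform at high T and
# concentrates on the minimum-energy state as T -> 0.
E = np.array([3.0, 1.0, 2.0, 0.0, 2.5, 1.5, 3.5, 0.5])  # one energy per bitstring

for T in (100.0, 1.0, 0.05):
    p = np.exp(-E / T)
    p /= p.sum()
    print(f"T={T:>6}: max prob {p.max():.3f} on state {int(p.argmax()):03b}")
```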

Summary

A Generative Neural Annealer for Black-Box Combinatorial Optimization: A Detailed Examination

The paper "A Generative Neural Annealer for Black-Box Combinatorial Optimization" by Yuan-Hang Zhang and Massimiliano Di Ventra proposes an innovative approach to tackling black-box combinatorial optimization problems, specifically those categorized as NP. By leveraging a deep learning framework inspired by annealing processes, this research provides a method that is both sample-efficient and capable of delivering high-quality solutions.

Core Methodology

The central contribution of this research is the Generative Neural Annealer (GNA), which uses a decoder-only transformer to approximate a temperature-conditioned Boltzmann distribution. By treating the black-box objective as an energy function, GNA learns the structure of the energy landscape across a continuum of temperatures, from a near-uniform distribution at high temperatures to a sharply peaked one at low temperatures, and this transition guides the search toward global optima. When queries are expensive, the temperature-dependent distributions enable data augmentation and improve sample efficiency; when queries are cheap but the problem remains hard, the model learns implicit interactions among variables, effectively "opening" the black box.
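The authors' code is not reproduced here; the following is a minimal PyTorch sketch of the general idea: an autoregressive model over binary variables, implemented as a causally masked transformer (the standard way to build a decoder-only transformer with torch.nn), with the temperature injected as a learned shift on the token embeddings. All names (e.g., TemperatureConditionedSampler) and architectural details are illustrative assumptions, not the paper's specification.

```python
import torch
import torch.nn as nn

class TemperatureConditionedSampler(nn.Module):
    """Autoregressive p_theta(x | T) over n_vars binary variables (sketch)."""

    def __init__(self, n_vars: int, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.n_vars = n_vars
        self.tok_emb = nn.Embedding(3, d_model)    # tokens 0, 1, and BOS (id 2)
        self.pos_emb = nn.Embedding(n_vars, d_model)
        self.temp_proj = nn.Linear(1, d_model)     # temperature conditioning
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 2)          # logits for the next bit

    def forward(self, bits: torch.Tensor, temperature: torch.Tensor) -> torch.Tensor:
        # bits: (B, n_vars) in {0,1}; temperature: (B, 1).
        B = bits.shape[0]
        bos = torch.full((B, 1), 2, dtype=torch.long, device=bits.device)
        tokens = torch.cat([bos, bits[:, :-1].long()], dim=1)   # shift right
        pos = torch.arange(self.n_vars, device=bits.device)
        h = self.tok_emb(tokens) + self.pos_emb(pos) \
            + self.temp_proj(temperature).unsqueeze(1)
        mask = nn.Transformer.generate_square_subsequent_mask(self.n_vars).to(bits.device)
        h = self.backbone(h, mask=mask)                          # causal attention
        return self.head(h)                                      # (B, n_vars, 2)

    @torch.no_grad()
    def sample(self, batch: int, temperature: float, device="cpu") -> torch.Tensor:
        """Draw bitstrings one variable at a time, conditioned on temperature."""
        bits = torch.zeros(batch, self.n_vars, dtype=torch.long, device=device)
        T = torch.full((batch, 1), temperature, device=device)
        for i in range(self.n_vars):
            logits = self.forward(bits, T)[:, i]   # depends only on bits[:, :i]
            bits[:, i] = torch.distributions.Categorical(logits=logits).sample()
        return bits
```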

Empirical Evaluation

The GNA framework was assessed on several established combinatorial optimization benchmarks: Ising sparsification, contamination control, 3-SAT, 3-XORSAT, and subset sum. The results show GNA is competitive with state-of-the-art black-box optimizers, and it is strongest on problems with large, rugged energy landscapes, where methods such as simulated annealing and Bayesian optimization often falter due to high query demands and computational overhead.
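For concreteness, here is one standard way to cast two of these benchmarks as black-box energies over bitstrings, so that minimizing E(x) solves the problem. The paper's exact encodings may differ; both functions below are illustrative.

```python
import numpy as np

def subset_sum_energy(x: np.ndarray, weights: np.ndarray, target: int) -> float:
    """Energy = distance of the chosen subset's sum from the target; 0 iff solved."""
    return abs(int(weights @ x) - target)

def three_sat_energy(x: np.ndarray, clauses: list[tuple[int, int, int]]) -> int:
    """Energy = number of unsatisfied clauses; 0 iff the formula is satisfied.

    Literals are 1-indexed; a negative literal means the variable is negated.
    Example: clauses = [(1, -2, 3)] encodes (x1 OR NOT x2 OR x3).
    """
    unsat = 0
    for clause in clauses:
        satisfied = any(
            x[abs(l) - 1] == (1 if l > 0 else 0) for l in clause
        )
        if not satisfied:
            unsat += 1
    return unsat
```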

Key Contributions and Findings

The paper emphasizes three major contributions:

  1. Temperature-Parameterized Boltzmann Distribution: GNA conditions a single generative model on temperature, enabling broad exploration of the solution space at high temperatures and exploitation near optima at low temperatures.
  2. Versatile Training Regimes: The paper introduces two training strategies, one for limited query budgets and one for abundant queries. Both leverage the model's ability to capture the structure of the combinatorial landscape without domain-specific knowledge (a common training objective in this family is sketched after this list).
  3. Capturing Variable Interactions: Empirical results indicate that GNA learns interactions among variables directly from queries, without bespoke problem encodings or explicit domain-specific constraints.
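The paper's losses are not reproduced here. A standard objective in this family, used by variational neural-annealing methods, minimizes the variational free energy F(θ; T) = E_{x~p_θ}[E(x) + T log p_θ(x)], whose minimizer is exactly the Boltzmann distribution at temperature T; whether GNA uses this precise form is an assumption. Below is a REINFORCE-style estimator of its gradient, compatible with the sampler sketched earlier.

```python
import torch

def free_energy_loss(model, energy_fn, batch: int, temperature: float, device="cpu"):
    """Score-function surrogate whose gradient estimates grad F(theta; T)."""
    x = model.sample(batch, temperature, device=device)      # (B, n_vars), no grad
    T = torch.full((batch, 1), temperature, device=device)
    logits = model(x, T)                                     # (B, n_vars, 2)
    log_p = torch.distributions.Categorical(logits=logits).log_prob(x).sum(dim=1)
    energy = torch.tensor([energy_fn(s) for s in x.cpu().numpy()],
                          dtype=torch.float, device=device)
    # Per-sample "local free energy"; subtracting the batch mean is a
    # standard variance-reduction baseline for the score-function gradient.
    reward = energy + temperature * log_p.detach()
    advantage = reward - reward.mean()
    return (advantage * log_p).mean()
```

In a training loop one would sample temperatures (e.g., log-uniformly over an annealing schedule), call loss.backward(), and step an optimizer. Under a limited query budget, previously queried (x, E(x)) pairs can be reused across temperatures, which appears to be the data-augmentation effect the abstract alludes to.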

Implications and Future Directions

The implications of this work extend to both theory and practice. Theoretically, modeling complex optimization landscapes with deep generative networks points to a promising direction for attacking NP-hard problems. Practically, the method is relevant wherever direct access to the problem's structure is limited or function evaluations are costly, a common situation in meta-heuristics and industrial applications where both efficiency and solution quality are critical.

Future work could integrate GNA with existing solvers, use it to model the optimization trajectories of established algorithms, or extend it to dynamic settings where constraints evolve over time. Additionally, exploring regularization techniques to enhance model robustness and mitigate overfitting, particularly in limited-query regimes, remains an open challenge.

Conclusion

Overall, Zhang and Di Ventra’s work on the Generative Neural Annealer offers an effective framework for black-box combinatorial optimization, extending the reach of machine-learning methods on complex, high-dimensional problems. The research not only exhibits strong empirical results but also lays a foundation for further exploration at the intersection of deep learning and optimization theory.