Accelerating Diffusion-based Combinatorial Optimization Solvers by Progressive Distillation (2308.06644v2)
Published 12 Aug 2023 in cs.LG and cs.AI
Abstract: Graph-based diffusion models have shown promising results in generating high-quality solutions to NP-complete (NPC) combinatorial optimization (CO) problems. However, these models are often inefficient at inference time, due to the iterative nature of the denoising diffusion process. This paper proposes to use progressive distillation to speed up inference by taking fewer steps (e.g., forecasting two steps ahead within a single step) during the denoising process. Our experimental results show that the progressively distilled model can perform inference 16 times faster with only 0.019% degradation in performance on the TSP-50 dataset.
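To make the step-halving idea concrete, here is a minimal PyTorch sketch of one progressive-distillation round: a student is initialized from the teacher and trained so that a single deterministic DDIM step matches two consecutive teacher steps. The `Denoiser` interface, `ddim_step`, and the `alphas_cumprod` schedule are illustrative assumptions, not the paper's actual code.

```python
# Sketch of one progressive-distillation round (hypothetical interfaces).
# Assumes a denoiser model(x_t, t) that predicts the clean sample x0,
# and a cumulative noise schedule alphas_cumprod of length num_steps + 1.
import copy
import torch
import torch.nn as nn

def ddim_step(model, x_t, t, t_next, alphas_cumprod):
    """One deterministic DDIM update from timestep t to t_next."""
    a_t, a_next = alphas_cumprod[t], alphas_cumprod[t_next]
    x0_pred = model(x_t, t)                                 # predict x0
    eps = (x_t - a_t.sqrt() * x0_pred) / (1 - a_t).sqrt()   # implied noise
    return a_next.sqrt() * x0_pred + (1 - a_next).sqrt() * eps

def distill_one_round(teacher, data_loader, alphas_cumprod, num_steps,
                      epochs=1, lr=1e-4):
    """Train a student whose single step matches two teacher steps."""
    student = copy.deepcopy(teacher)          # initialize student from teacher
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x0 in data_loader:
            # Sample a timestep on the coarse (halved) grid.
            i = torch.randint(1, num_steps // 2 + 1, (1,)).item()
            t, t_mid, t_next = 2 * i, 2 * i - 1, 2 * i - 2
            noise = torch.randn_like(x0)
            a_t = alphas_cumprod[t]
            x_t = a_t.sqrt() * x0 + (1 - a_t).sqrt() * noise
            with torch.no_grad():             # teacher takes two fine steps
                x_mid = ddim_step(teacher, x_t, t, t_mid, alphas_cumprod)
                x_tgt = ddim_step(teacher, x_mid, t_mid, t_next, alphas_cumprod)
            # Student must reach the same point in one coarse step.
            x_stu = ddim_step(student, x_t, t, t_next, alphas_cumprod)
            loss = nn.functional.mse_loss(x_stu, x_tgt)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student   # becomes the teacher for the next halving round
```

Each round halves the number of denoising steps, with the returned student serving as the next round's teacher; under this scheme, four halving rounds would account for the 16x inference speedup reported in the abstract.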
- Junwei Huang
- Zhiqing Sun
- Yiming Yang