Smoothed Analysis of the 2-Opt Heuristic for the TSP under Gaussian Noise (2308.00306v1)

Published 1 Aug 2023 in cs.DS

Abstract: The 2-opt heuristic is a very simple local search heuristic for the traveling salesperson problem. In practice it usually converges quickly to solutions within a few percent of optimality. In contrast, its running time is exponential and its approximation performance is poor in the worst case. Englert, Röglin, and Vöcking (Algorithmica, 2014) provided a smoothed analysis in the so-called one-step model in order to explain the performance of 2-opt on d-dimensional Euclidean instances, both in terms of running time and in terms of approximation ratio. However, translating their results to the classical model of smoothed analysis, where points are perturbed by Gaussian distributions with standard deviation σ, yields only weak bounds. We prove bounds that are polynomial in n and 1/σ for the smoothed running time with Gaussian perturbations. In addition, our analysis for Euclidean distances is much simpler than the existing smoothed analysis. Furthermore, we prove a smoothed approximation ratio of O(log(1/σ)). This bound is almost tight, as we also provide a lower bound of Ω(log n / log log n) for σ = O(1/√n). Our main technical novelty is that, unlike existing smoothed analyses, we do not separately analyze the objective values of the global and local optimum on all inputs (which only allows for a bound of O(1/σ)), but simultaneously bound them on the same input.
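
The abstract refers to the 2-opt local search heuristic and to the classical smoothed-analysis model, in which each adversarial input point is perturbed by independent Gaussian noise with standard deviation σ. The sketch below illustrates that setting; it is not code from the paper, and the function names (gaussian_perturb, two_opt), the random starting tour, and the numeric tolerance are illustrative choices.

```python
import math
import random


def gaussian_perturb(points, sigma):
    """Perturb each adversarial point by independent Gaussian noise with
    standard deviation sigma (the classical smoothed-analysis model)."""
    return [tuple(c + random.gauss(0.0, sigma) for c in p) for p in points]


def tour_length(points, tour):
    """Total Euclidean length of the closed tour."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))


def two_opt(points, tour):
    """Plain 2-opt local search: repeatedly replace two tour edges by two
    shorter ones (reversing the segment in between) until no improving
    exchange exists, i.e. until a local optimum is reached."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # the two chosen edges would share a vertex
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # Change in tour length if edges (a,b),(c,d) become (a,c),(b,d).
                delta = (math.dist(points[a], points[c])
                         + math.dist(points[b], points[d])
                         - math.dist(points[a], points[b])
                         - math.dist(points[c], points[d]))
                if delta < -1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour


if __name__ == "__main__":
    n, sigma = 50, 0.1  # illustrative instance size and noise level
    base = [(random.random(), random.random()) for _ in range(n)]
    pts = gaussian_perturb(base, sigma)
    tour = two_opt(pts, list(range(n)))
    print(f"2-opt local optimum length: {tour_length(pts, tour):.3f}")
```

The paper's running-time question concerns how many improving exchanges the while-loop above can perform, and its approximation question compares the length of the returned local optimum to the global optimum; the sketch only demonstrates the mechanics, not the analysis.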

Citations (1)
