
Optimal Smoothed Analysis of the Simplex Method

Published 5 Apr 2025 in cs.DS (arXiv:2504.04197v1)

Abstract: Smoothed analysis is a method for analyzing the performance of algorithms, used especially for those algorithms whose running time in practice is significantly better than what can be proven through worst-case analysis. Spielman and Teng (STOC '01) introduced the smoothed analysis framework of algorithm analysis and applied it to the simplex method. Given an arbitrary linear program with $d$ variables and $n$ inequality constraints, Spielman and Teng proved that the simplex method runs in time $O(\sigma^{-30} d^{55} n^{86})$, where $\sigma > 0$ is the standard deviation of Gaussian distributed noise added to the original LP data. Spielman and Teng's result was simplified and strengthened over a series of works, with the current strongest upper bound being $O(\sigma^{-3/2} d^{13/4} \log(n)^{7/4})$ pivot steps due to Huiberts, Lee and Zhang (STOC '23). We prove that there exists a simplex method whose smoothed complexity is upper bounded by $O(\sigma^{-1/2} d^{11/4} \log(n)^{7/4})$ pivot steps. Furthermore, we prove a matching high-probability lower bound of $\Omega(\sigma^{-1/2} d^{1/2} \ln(4/\sigma)^{-1/4})$ on the combinatorial diameter of the feasible polyhedron after smoothing, on instances using $n = \lfloor (4/\sigma)^d \rfloor$ inequality constraints. This lower bound indicates that our algorithm has optimal noise dependence among all simplex methods, up to polylogarithmic factors.
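The smoothed-analysis setup described above can be sketched in code: start from an arbitrary ("adversarial") LP, perturb each entry of the data with i.i.d. Gaussian noise of standard deviation $\sigma$, then solve the perturbed instance with a simplex-type solver and read off the pivot count. The sketch below uses SciPy's HiGHS dual-simplex backend purely as a stand-in; it is not the theoretical simplex variant analyzed in the paper, and the instance here is random rather than adversarial.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
d, n, sigma = 5, 40, 0.1  # illustrative sizes, not the paper's parameters

# Arbitrary LP data: maximize c^T x subject to A x <= b, 0 <= x <= 1.
A = rng.uniform(-1.0, 1.0, size=(n, d))
b = np.ones(n)
c = rng.uniform(-1.0, 1.0, size=d)

# Smoothing step: add independent Gaussian noise N(0, sigma^2) to each entry.
A_smooth = A + rng.normal(0.0, sigma, size=A.shape)
b_smooth = b + rng.normal(0.0, sigma, size=b.shape)

# Solve the smoothed LP with a dual-simplex method and count pivot steps.
# linprog minimizes, so negate c to maximize.
res = linprog(-c, A_ub=A_smooth, b_ub=b_smooth,
              bounds=(0, 1), method="highs-ds")
print("solved:", res.status == 0, "pivot steps:", res.nit)
```

In a smoothed-analysis experiment one would repeat this over many noise draws for a fixed worst-case $(A, b, c)$ and track the expected pivot count as a function of $\sigma$, $d$, and $n$.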

Authors (2)
