
Near-linear Size Hypergraph Cut Sparsifiers (2009.04992v1)

Published 10 Sep 2020 in cs.DS

Abstract: Cuts in graphs are a fundamental object of study, and play a central role in the study of graph algorithms. The problem of sparsifying a graph while approximately preserving its cut structure has been extensively studied and has many applications. In a seminal work, Benczúr and Karger (1996) showed that given any $n$-vertex undirected weighted graph $G$ and a parameter $\varepsilon \in (0,1)$, there is a near-linear time algorithm that outputs a weighted subgraph $G'$ of $G$ of size $\tilde{O}(n/\varepsilon^2)$ such that the weight of every cut in $G$ is preserved to within a $(1 \pm \varepsilon)$-factor in $G'$. The graph $G'$ is referred to as a {\em $(1 \pm \varepsilon)$-approximate cut sparsifier} of $G$. A natural question is if such cut-preserving sparsifiers also exist for hypergraphs. Kogan and Krauthgamer (2015) initiated a study of this question and showed that given any weighted hypergraph $H$ where the cardinality of each hyperedge is bounded by $r$, there is a polynomial-time algorithm to find a $(1 \pm \varepsilon)$-approximate cut sparsifier of $H$ of size $\tilde{O}(\frac{nr}{\varepsilon^2})$. Since $r$ can be as large as $n$, in general, this gives a hypergraph cut sparsifier of size $\tilde{O}(n^2/\varepsilon^2)$, which is a factor $n$ larger than the Benczúr-Karger bound for graphs. It has been an open question whether or not the Benczúr-Karger bound is achievable on hypergraphs. In this work, we resolve this question in the affirmative by giving a new polynomial-time algorithm for creating hypergraph sparsifiers of size $\tilde{O}(n/\varepsilon^2)$.
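To make the abstract's central definition concrete, here is a minimal Python sketch (not from the paper) of the $(1 \pm \varepsilon)$ cut-sparsifier guarantee for hypergraphs: a hyperedge is cut by a vertex set $S$ exactly when it has vertices on both sides of the partition, and a sparsifier must preserve the weight of every cut up to a $(1 \pm \varepsilon)$ factor. The hypergraphs `H` and `Hp` below are illustrative examples, and the brute-force checker is exponential in $n$, so it is only meant for tiny instances.

```python
from itertools import combinations

def cut_weight(hyperedges, S):
    """Total weight of hyperedges crossing vertex set S: a hyperedge
    is cut iff it has at least one vertex in S and one outside S."""
    S = set(S)
    return sum(w for e, w in hyperedges
               if 0 < len(S & set(e)) < len(e))

def is_cut_sparsifier(H, Hp, n, eps):
    """Brute-force check that every nontrivial cut of H is preserved
    within a (1 +/- eps) factor by Hp. Exponential in n; toy use only."""
    for k in range(1, n):
        for S in combinations(range(n), k):
            c, cp = cut_weight(H, S), cut_weight(Hp, S)
            if not (1 - eps) * c <= cp <= (1 + eps) * c:
                return False
    return True

# Toy hypergraph on 4 vertices: (hyperedge, weight) pairs (illustrative).
H = [((0, 1, 2), 1.0), ((1, 2), 2.0), ((2, 3), 1.0), ((0, 3), 1.0)]
# A candidate sparsifier obtained by dropping one hyperedge.
Hp = [((0, 1, 2), 1.0), ((1, 2), 2.0), ((2, 3), 1.0)]

print(is_cut_sparsifier(H, Hp, 4, eps=0.5))  # True
print(is_cut_sparsifier(H, Hp, 4, eps=0.3))  # False: cut {0} shrinks from 2.0 to 1.0
```

The paper's contribution is an algorithm that produces such a sparsifier with only $\tilde{O}(n/\varepsilon^2)$ hyperedges, matching the Benczúr-Karger bound for ordinary graphs; the checker above only verifies the guarantee, it does not construct the sparsifier.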

Citations (26)
