Hypergraph Removal Lemmas via Robust Sharp Threshold Theorems (1804.00328v3)

Published 1 Apr 2018 in math.CO

Abstract: The classical sharp threshold theorem of Friedgut and Kalai (1996) asserts that any symmetric monotone function $f:\{0,1\}^{n}\to\{0,1\}$ exhibits a sharp threshold phenomenon. This means that the expectation of $f$ with respect to the biased measure $\mu_{p}$ increases rapidly from 0 to 1 as $p$ increases. In this paper we present `robust' versions of the theorem, which assert that it holds also if the function is `almost' monotone and admits a much weaker notion of symmetry. Unlike the original proof of the theorem, which relies on hypercontractivity, our proof relies on a `regularity' lemma (of the class of Szemer\'edi's regularity lemma and its generalizations) and on the `invariance principle' of Mossel, O'Donnell, and Oleszkiewicz, which allows (under certain conditions) replacing functions on the cube $\{0,1\}^{n}$ with functions of Gaussian random variables. The hypergraph removal lemma of Gowers (2007) and, independently, of Nagle, R\"odl, Schacht, and Skokan (2006) says that if a $k$-uniform hypergraph on $n$ vertices contains few copies of a fixed hypergraph $H$, then it can be made $H$-free by removing few of its edges. While this settles the `hypergraph removal problem' in the case where $k$ and $H$ are fixed, the result is meaningless when $k$ is large (e.g. $k>\log\log\log n$). Using our robust version of the Friedgut-Kalai theorem, we obtain a hypergraph removal lemma that holds for $k$ up to linear in $n$ for a large class of hypergraphs. These contain all hypergraphs such that both their number of edges and the sizes of the pairwise intersections of their edges are bounded above by a constant.
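
For orientation, rough forms of the two classical statements the abstract builds on are sketched below. These are the standard textbook formulations (with an unspecified absolute constant $C$ and standard quantifiers), not quoted from this paper, so the exact form should be treated as an assumption.

\[
\text{(Friedgut--Kalai, sketch)}\qquad \mu_{p}(f)>\varepsilon \;\Longrightarrow\; \mu_{q}(f)>1-\varepsilon \quad\text{for } q \ge p+C\,\frac{\log(1/2\varepsilon)}{\log n},
\]
for every monotone $f:\{0,1\}^{n}\to\{0,1\}$ that is invariant under a transitive group of permutations of the coordinates.

\[
\text{(Hypergraph removal, sketch)}\qquad \forall\varepsilon>0\;\exists\delta>0:\quad \#\{\text{copies of }H\text{ in }G\}\le\delta\, n^{v(H)} \;\Longrightarrow\; G\text{ can be made }H\text{-free by deleting at most }\varepsilon\binom{n}{k}\text{ edges},
\]
where $k$ and the $k$-uniform hypergraph $H$ are fixed and $G$ is any $k$-uniform hypergraph on $n$ vertices. The paper's point is that $\delta$ degenerates as $k$ grows, whereas its robust sharp threshold approach keeps the conclusion meaningful for $k$ up to linear in $n$ (for the restricted class of $H$ described above).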
