On Hardness Assumptions Needed for "Extreme High-End" PRGs and Fast Derandomization (2311.11663v1)

Published 20 Nov 2023 in cs.CC

Abstract: The hardness vs. randomness paradigm aims to explicitly construct pseudorandom generators $G:\{0,1\}^r \rightarrow \{0,1\}^m$ that fool circuits of size $m$, assuming the existence of explicit hard functions. A "high-end PRG" with seed length $r=O(\log m)$ (implying BPP = P) was achieved in a seminal work of Impagliazzo and Wigderson (STOC 1997), assuming the high-end hardness assumption: there exist constants $0<\beta < 1< B$, and functions computable in time $2^{B \cdot n}$ that cannot be computed by circuits of size $2^{\beta \cdot n}$. Recently, motivated by fast derandomization of randomized algorithms, Doron et al. (FOCS 2020) and Chen and Tell (STOC 2021) construct "extreme high-end PRGs" with seed length $r=(1+o(1))\cdot \log m$, under qualitatively stronger assumptions. We study whether extreme high-end PRGs can be constructed from the following scaled version of the assumption, which we call "the extreme high-end hardness assumption", in which $\beta=1-o(1)$ and $B=1+o(1)$. We give a partial negative answer, showing that certain approaches cannot yield a black-box proof. (A longer abstract with more details appears in the PDF file.)
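For reference, a minimal formal sketch of the notion of "fooling" used above; the distinguishing error $\varepsilon$ is a parameter the abstract leaves implicit, so its placement here is an assumption. A generator $G:\{0,1\}^r \rightarrow \{0,1\}^m$ fools circuits of size $m$ if for every circuit $C$ of size $m$,

\[
\Bigl|\Pr_{s \sim \{0,1\}^r}\bigl[C(G(s)) = 1\bigr] - \Pr_{x \sim \{0,1\}^m}\bigl[C(x) = 1\bigr]\Bigr| \le \varepsilon .
\]

Under this notation, the extreme high-end hardness assumption studied in the paper scales the Impagliazzo-Wigderson constants to $\beta = 1-o(1)$ and $B = 1+o(1)$: there are functions computable in time $2^{(1+o(1)) \cdot n}$ that cannot be computed by circuits of size $2^{(1-o(1)) \cdot n}$.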

Citations (10)
