
Barriers for Faster Dimensionality Reduction (2207.03304v1)

Published 7 Jul 2022 in cs.DS

Abstract: The Johnson-Lindenstrauss transform allows one to embed a dataset of $n$ points in $\mathbb{R}^d$ into $\mathbb{R}^m$, while preserving the pairwise distance between any pair of points up to a factor of $(1 \pm \varepsilon)$, provided that $m = \Omega(\varepsilon^{-2} \lg n)$. The transform has found an overwhelming number of algorithmic applications, allowing one to speed up algorithms and reduce memory consumption at the price of a small loss in accuracy. A central line of research on such transforms focuses on developing fast embedding algorithms, with the classic example being the Fast JL transform of Ailon and Chazelle. All known such algorithms have an embedding time of $\Omega(d \lg d)$, but no lower bounds rule out a clean $O(d)$ embedding time. In this work, we establish the first non-trivial lower bounds (of magnitude $\Omega(m \lg m)$) for a large class of embedding algorithms, including in particular most known upper bounds.
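For intuition, the sketch below illustrates the classic dense Gaussian Johnson-Lindenstrauss construction that the lower bound is contrasted against; it is not the paper's method and not the Fast JL transform of Ailon and Chazelle (which uses a randomized Hadamard transform to reach $O(d \lg d)$ time). The function name `jl_embed` and the constant factor in the target dimension are illustrative assumptions, not values from the paper.

```python
import numpy as np

def jl_embed(X, eps, rng=None):
    """Embed the rows of X (n points in R^d) into R^m with a dense Gaussian
    Johnson-Lindenstrauss projection, where m ~ eps^{-2} * lg n.

    Illustrative sketch only: this is the classic dense construction with
    Theta(n * m * d) embedding time, not a fast transform.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    # Target dimension m = O(eps^{-2} lg n); the constant 8 is a common
    # illustrative choice, not a bound taken from the paper.
    m = int(np.ceil(8 * np.log(n) / eps**2))
    # Each entry i.i.d. N(0, 1/m), so squared norms are preserved in expectation.
    A = rng.normal(scale=1.0 / np.sqrt(m), size=(m, d))
    return X @ A.T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10_000))
    Y = jl_embed(X, eps=0.25, rng=rng)
    # Pairwise distances should be preserved up to a (1 +/- eps) factor w.h.p.
    orig = np.linalg.norm(X[0] - X[1])
    emb = np.linalg.norm(Y[0] - Y[1])
    print(f"distortion ratio: {emb / orig:.3f}")
```

The fast constructions discussed in the paper replace the dense matrix multiply above with structured transforms; the paper's contribution is showing that a large class of such algorithms cannot beat $\Omega(m \lg m)$ embedding time.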

Citations (1)
