
Embedding Probability Distributions into Low Dimensional $\ell_1$: Tree Ising Models via Truncated Metrics (2312.02435v2)

Published 5 Dec 2023 in cs.DS

Abstract: Given an arbitrary set of high dimensional points in $\ell_1$, there are known negative results that preclude the possibility of always mapping them to a low dimensional $\ell_1$ space while preserving distances with small multiplicative distortion. This is in stark contrast with dimension reduction in Euclidean space ($\ell_2$), where such mappings are always possible. While the first non-trivial lower bounds for $\ell_1$ dimension reduction were established almost 20 years ago, there has been limited progress in understanding what sets of points in $\ell_1$ are conducive to a low-dimensional mapping. In this work, we study a new characterization of $\ell_1$ metrics that are conducive to dimension reduction in $\ell_1$. Our characterization focuses on metrics that are defined by the disagreement of binary variables over a probability distribution -- any $\ell_1$ metric can be represented in this form. We show that, for configurations of $n$ points in $\ell_1$ obtained from tree Ising models, we can reduce dimension to $\mathrm{polylog}(n)$ with constant distortion. In doing so, we develop technical tools for embedding truncated metrics, which have been studied because of their applications in computer vision and are objects of independent interest in metric geometry. Among other tools, we show how any $\ell_1$ metric can be truncated with $O(1)$ distortion and $O(\log(n))$ blowup in dimension.
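To make two of the abstract's objects concrete, the following illustrative sketch (not code from the paper) shows (a) the disagreement-probability view of $\ell_1$ distances in the special case of binary points, where the probability that a uniformly random coordinate disagrees equals the normalized $\ell_1$ distance, and (b) a truncated metric, which simply caps every distance at a threshold $t$. All names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64  # ambient dimension (illustrative choice)

# Binary points in {0,1}^d. Each point u induces a {0,1}-valued random
# variable B_u: sample a coordinate i uniformly and output u[i].
points = rng.integers(0, 2, size=(5, d))

def disagreement(u, v):
    # Pr_i[u[i] != v[i]] under the uniform distribution on coordinates
    return np.mean(u != v)

def ell1(u, v):
    # Standard ell_1 distance ||u - v||_1
    return np.sum(np.abs(u - v))

# For binary points the two notions coincide up to normalization:
# Pr[disagree] = ||u - v||_1 / d.
u, v = points[0], points[1]
assert disagreement(u, v) == ell1(u, v) / d

def truncated(u, v, t):
    # Truncated metric: cap the ell_1 distance at threshold t
    return min(t, ell1(u, v))
```

For general (non-binary) points in $\ell_1$, the paper's representation uses a suitable probability distribution over binary variables rather than uniform coordinate sampling; the binary case above is only the simplest instance.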

