Deterministic Discrepancy Minimization via the Multiplicative Weight Update Method (1611.08752v3)

Published 26 Nov 2016 in cs.DM, cs.CG, cs.DS, and math.CO

Abstract: A well-known theorem of Spencer shows that any set system with $n$ sets over $n$ elements admits a coloring of discrepancy $O(\sqrt{n})$. While the original proof was non-constructive, recent progress brought polynomial time algorithms by Bansal, Lovett and Meka, and Rothvoss. All those algorithms are randomized, even though Bansal's algorithm admitted a complicated derandomization. We propose an elegant deterministic polynomial time algorithm that is inspired by Lovett-Meka as well as the Multiplicative Weight Update method. The algorithm iteratively updates a fractional coloring while controlling the exponential weights that are assigned to the set constraints. A conjecture by Meka suggests that Spencer's bound can be generalized to symmetric matrices. We prove that $n \times n$ matrices that are block diagonal with block size $q$ admit a coloring of discrepancy $O(\sqrt{n} \cdot \sqrt{\log(q)})$. Bansal, Dadush and Garg recently gave a randomized algorithm to find a vector $x$ with entries in $\{-1,1\}$ with $\|Ax\|_{\infty} \leq O(\sqrt{\log n})$ in polynomial time, where $A$ is any matrix whose columns have length at most 1. We show that our method can be used to deterministically obtain such a vector.
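
The sketch below illustrates the general flavor of the approach described in the abstract: maintain a fractional coloring $x \in [-1,1]^n$, keep exponential weights on the set constraints, and repeatedly move $x$ in a direction that avoids increasing the heavily weighted constraints until coordinates freeze at $\pm 1$. It is not the paper's actual algorithm (which chooses the update direction deterministically via a carefully constrained subspace); the random direction, step size, and stopping rule here are simplifications for illustration only.

```python
import numpy as np

def mwu_coloring_sketch(A, eta=0.1, step=0.05, max_iters=20000):
    """Illustrative sketch of an MWU-style discrepancy heuristic.

    A: 0/1 incidence matrix of the set system (rows = sets, columns = elements).
    Returns a +-1 coloring of the columns. This is a simplified stand-in for
    the deterministic algorithm in the paper, not a faithful implementation.
    """
    m, n = A.shape
    x = np.zeros(n)                    # fractional coloring, starts at the origin
    alive = np.abs(x) < 1.0            # coordinates not yet frozen at +-1

    for _ in range(max_iters):
        if not alive.any():
            break

        # Exponential weights: constraints with large |<row, x>| become heavy.
        disc = A @ x
        weight = np.exp(eta * disc) + np.exp(-eta * disc)

        # Gradient of the exponential potential (up to a constant factor).
        grad = A.T @ (weight * np.tanh(eta * disc))
        g = grad[alive]

        # Heuristic update direction: random, projected orthogonal to the
        # potential gradient so heavy constraints are roughly unaffected.
        d = np.random.randn(alive.sum())
        if np.linalg.norm(g) > 1e-12:
            d -= (d @ g) / (g @ g) * g
        if np.linalg.norm(d) < 1e-12:
            continue
        d /= np.linalg.norm(d)

        # Walk along d until a coordinate hits +-1 or a small step is taken.
        x_alive = x[alive]
        with np.errstate(divide="ignore", invalid="ignore"):
            t_pos = np.where(d > 0, (1.0 - x_alive) / d, np.inf)
            t_neg = np.where(d < 0, (-1.0 - x_alive) / d, np.inf)
        t = min(step, t_pos.min(), t_neg.min())
        x[alive] = x_alive + t * d

        # Freeze coordinates that reached the boundary.
        alive = np.abs(x) < 1.0 - 1e-9

    # Round any remaining fractional entries to +-1.
    return np.where(x >= 0, 1.0, -1.0)
```

A quick way to exercise the sketch is a random set system, e.g. `A = (np.random.rand(64, 64) < 0.5).astype(float)`, then checking `np.abs(A @ mwu_coloring_sketch(A)).max()` against the trivial bound $n$; the paper's deterministic algorithm guarantees $O(\sqrt{n})$ discrepancy, which this simplified heuristic does not.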

Authors (3)
  1. Avi Levy (12 papers)
  2. Harishchandra Ramadas (3 papers)
  3. Thomas Rothvoss (42 papers)
Citations (57)
