
Average-Case Matrix Discrepancy: Asymptotics and Online Algorithms (2307.10055v2)

Published 19 Jul 2023 in cs.DS, cs.DM, math.CO, and math.PR

Abstract: We study the operator norm discrepancy of i.i.d. random matrices, initiating the matrix-valued analog of a long line of work on the $\ell_\infty$ norm discrepancy of i.i.d. random vectors. First, using repurposed results on vector discrepancy and new first moment method calculations, we give upper and lower bounds on the discrepancy of random matrices. We treat i.i.d. matrices drawn from the Gaussian orthogonal ensemble (GOE) and low-rank Gaussian Wishart distributions. In both cases, for what turns out to be the "critical" number of $\Theta(n^2)$ matrices of dimension $n \times n$, we identify the discrepancy up to constant factors. Second, we give a new analysis of the matrix hyperbolic cosine algorithm of Zouzias (2011), a matrix version of an online vector discrepancy algorithm of Spencer (1977) studied for average-case inputs by Bansal and Spencer (2020), for the case of i.i.d. random matrix inputs. We both give a general analysis and extract concrete bounds on the discrepancy achieved by this algorithm for matrices with independent entries (including GOE matrices) and Gaussian Wishart matrices.
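The online algorithm referenced in the abstract can be illustrated with a minimal sketch. This is not the paper's analysis, only a toy rendering of the greedy potential-minimization idea behind the matrix hyperbolic cosine algorithm: maintain a running signed sum $S$, and assign each incoming symmetric matrix $A_i$ the sign that minimizes the potential $\operatorname{Tr}\cosh(\beta S)$. The parameter `beta` and the GOE-style normalization below are illustrative choices, not values taken from the paper.

```python
import numpy as np

def trace_cosh(M, beta):
    # For symmetric M, Tr cosh(beta*M) = sum of cosh(beta * eigenvalue).
    eigs = np.linalg.eigvalsh(M)
    return np.sum(np.cosh(beta * eigs))

def matrix_hyperbolic_cosine(matrices, beta=0.1):
    """Greedy online signing: choose each sign to minimize the
    Tr cosh potential of the running signed sum (illustrative sketch)."""
    n = matrices[0].shape[0]
    S = np.zeros((n, n))
    signs = []
    for A in matrices:
        s = 1 if trace_cosh(S + A, beta) <= trace_cosh(S - A, beta) else -1
        S = S + s * A
        signs.append(s)
    return signs, S

# Toy input: i.i.d. GOE-like symmetric matrices, normalized so the
# operator norm of each matrix is O(1).
rng = np.random.default_rng(0)
n, m = 20, 100
mats = []
for _ in range(m):
    G = rng.standard_normal((n, n))
    mats.append((G + G.T) / np.sqrt(2 * n))

signs, S = matrix_hyperbolic_cosine(mats)
disc = np.linalg.norm(S, 2)  # operator norm discrepancy of the signed sum
```

The potential $\operatorname{Tr}\cosh(\beta S)$ upper-bounds $\cosh(\beta\|S\|_{\mathrm{op}})$, so keeping the potential small keeps the operator norm of the signed sum small; this is the standard intuition for matrix potential-function arguments, here rendered as a naive greedy loop.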

Citations (1)