
Efficient Convex Optimization Requires Superlinear Memory (2203.15260v2)

Published 29 Mar 2022 in cs.LG, cs.CC, cs.DS, math.OC, and stat.ML

Abstract: We show that any memory-constrained, first-order algorithm which minimizes $d$-dimensional, $1$-Lipschitz convex functions over the unit ball to $1/\mathrm{poly}(d)$ accuracy using at most $d^{1.25 - \delta}$ bits of memory must make at least $\tilde{\Omega}(d^{1 + (4/3)\delta})$ first-order queries (for any constant $\delta \in [0, 1/4]$). Consequently, the performance of such memory-constrained algorithms is a polynomial factor worse than the optimal $\tilde{O}(d)$ query bound for this problem obtained by cutting plane methods that use $\tilde{O}(d^2)$ memory. This resolves a COLT 2019 open problem of Woodworth and Srebro.
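
To make the stated tradeoff concrete, the sketch below works out the two endpoints of the bound directly from the abstract; it is an illustration of the claimed rates, not text taken from the paper.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Memory-vs-query tradeoff claimed in the abstract: any first-order method
% using at most d^{1.25-delta} bits of memory needs at least
% ~Omega(d^{1+(4/3)delta}) first-order queries, for constant delta in [0, 1/4].
\[
  \text{memory} \le d^{\,1.25-\delta} \text{ bits}
  \;\Longrightarrow\;
  \#\text{queries} \ge \tilde{\Omega}\!\bigl(d^{\,1+\frac{4}{3}\delta}\bigr),
  \qquad \delta \in [0, \tfrac{1}{4}].
\]
% Endpoints of the allowed range of delta:
\[
  \delta = 0:\ d^{1.25}\ \text{bits} \;\Rightarrow\; \tilde{\Omega}(d)\ \text{queries},
  \qquad
  \delta = \tfrac{1}{4}:\ d\ \text{bits} \;\Rightarrow\; \tilde{\Omega}(d^{4/3})\ \text{queries}.
\]
% Baseline from the abstract: cutting plane methods achieve ~O(d) queries
% but use ~O(d^2) bits of memory.
\end{document}
```

At $\delta = 1/4$, a near-linear memory budget forces a polynomially larger $\tilde{\Omega}(d^{4/3})$ query count, which is the sense in which the $\tilde{O}(d)$-query bound of quadratic-memory cutting plane methods cannot be matched with subquadratic memory.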

Citations (14)
