Accelerating Frank-Wolfe Algorithm using Low-Dimensional and Adaptive Data Structures (2207.09002v1)
Abstract: In this paper, we study the problem of speeding up a class of optimization algorithms known as Frank-Wolfe, a conditional gradient method. We develop and employ two novel inner product search data structures, improving on the prior fastest algorithm in [Shrivastava, Song and Xu, NeurIPS 2021].
* The first data structure uses low-dimensional random projection to reduce the problem to a lower dimension, then applies an efficient inner product search data structure. It has preprocessing time $\tilde O(nd^{\omega-1}+dn^{1+o(1)})$ and per iteration cost $\tilde O(d+n^\rho)$ for small constant $\rho$.
* The second data structure leverages the recent development of adaptive inner product search data structures that can output estimates of all inner products. It has preprocessing time $\tilde O(nd)$ and per iteration cost $\tilde O(d+n)$.
The first algorithm improves the state-of-the-art (with preprocessing time $\tilde O(d^2 n^{1+o(1)})$ and per iteration cost $\tilde O(dn^\rho)$) in all cases, while the second one provides an even faster preprocessing time and is suitable when the number of iterations is small.
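The sketch below is not the paper's data structure; it only illustrates the random-projection step underlying the first result: project the $n$ points from dimension $d$ down to a small dimension $k$ once during preprocessing, then estimate all inner products against a query in the projected space. The choices of $k$, the Gaussian projection, and the variable names are illustrative assumptions.

```python
import numpy as np

def build_sketch(X, k, seed=0):
    """Preprocess (illustrative): project the n x d data matrix X to n x k
    with a Johnson-Lindenstrauss style Gaussian sketch."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    S = rng.standard_normal((d, k)) / np.sqrt(k)  # random projection matrix
    return S, X @ S                               # store the projected points

def approx_inner_products(S, X_proj, q):
    """Per query: project q once (O(dk)), then estimate all n inner
    products <x_i, q> by <S^T x_i, S^T q> in O(nk) time."""
    q_proj = q @ S
    return X_proj @ q_proj

if __name__ == "__main__":
    # Toy usage: compare projected estimates against exact inner products.
    n, d, k = 1000, 512, 64
    rng = np.random.default_rng(1)
    X = rng.standard_normal((n, d))
    q = rng.standard_normal(d)
    S, X_proj = build_sketch(X, k)
    est = approx_inner_products(S, X_proj, q)
    exact = X @ q
    print("mean absolute error:", np.abs(est - exact).mean())
```

In the paper's setting, the projected points would then be fed into a fast inner product search data structure rather than scanned linearly as in this toy example; the linear scan here is only to keep the sketch self-contained.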