Decentralised convex optimisation with probability-proportional-to-size quantization
Abstract: Communication is one of the bottlenecks of distributed optimisation and learning. To overcome this bottleneck, we propose a novel quantization method that transforms a vector into a sample of component indices drawn from a categorical distribution with probabilities proportional to the values of those components. We then propose primal and primal-dual accelerated stochastic gradient methods that use this quantization, and derive their convergence rates in terms of probabilities of large deviations. We focus on affine-constrained convex optimisation and its application to decentralised distributed optimisation problems. To illustrate our algorithm, we apply it to the decentralised computation of semi-discrete entropy-regularized Wasserstein barycenters.
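The abstract's quantization step can be sketched as follows. This is a minimal illustration of one plausible reading, not the paper's exact scheme: we assume the sampled indices are turned into an unbiased sparse estimate of the vector, and the function name `pps_quantize` and the sign/magnitude handling are our own assumptions.

```python
import numpy as np

def pps_quantize(v, num_samples, rng=None):
    """Sample component indices with probability proportional to |v_i|
    (probability-proportional-to-size), and return an unbiased sparse
    estimate of v built from those indices.  Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    weights = np.abs(v)
    total = weights.sum()
    probs = weights / total
    # Categorical sampling: index i is drawn with probability |v_i| / total.
    idx = rng.choice(len(v), size=num_samples, p=probs)
    # Each sampled coordinate j contributes sign(v_j) * total / num_samples,
    # so E[q_j] = num_samples * (|v_j|/total) * sign(v_j) * total/num_samples = v_j.
    q = np.zeros_like(v, dtype=float)
    np.add.at(q, idx, np.sign(v[idx]) * total / num_samples)
    return q, idx
```

Only the sampled indices (and the scalar `total`) need to be communicated, which is the source of the communication savings; the receiver can reconstruct the sparse estimate locally.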