Mini-batch stochastic three-operator splitting for distributed optimization (2203.04020v1)

Published 8 Mar 2022 in math.OC and cs.MA

Abstract: We consider a network of agents, each with its own private cost consisting of a sum of two possibly nonsmooth convex functions, one of which is composed with a linear operator. At every iteration each agent performs local calculations and can only communicate with its neighbors. The challenging aspect of our study is that the smooth part of the private cost function is given as an expected value and agents only have access to this part of the problem formulation via a heavy-tailed stochastic oracle. To tackle such sampling-based optimization problems, we propose a stochastic extension of the triangular pre-conditioned primal-dual algorithm. We demonstrate almost sure convergence of the scheme and validate the performance of the method via numerical experiments.
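The paper's exact triangular pre-conditioned updates are not reproduced in this listing, so the following is only a minimal illustrative sketch of a generic stochastic primal-dual iteration (Condat-Vũ style) for a single agent's problem min_x E[f(x; ξ)] + g(x) + h(Lx), where the smooth term is accessed through mini-batch gradient estimates standing in for the stochastic oracle. The problem data, step sizes, and function choices (a least-squares f, ℓ1 penalties for g and h) are assumptions for illustration only, not the paper's setup, and the scheme omits the multi-agent communication structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem (not from the paper):
#   min_x  E[0.5 * (a_i^T x - b_i)^2]  +  lam1 * ||x||_1  +  lam2 * ||L x||_1
# The expectation is approximated by sampling rows of (A, b), playing the
# role of the stochastic oracle for the smooth part.
n, m = 20, 15
A = rng.normal(size=(500, n))                    # data pool for the smooth term
b = A @ rng.normal(size=n) + 0.1 * rng.normal(size=500)
L = rng.normal(size=(m, n))                      # linear operator in the composite term
lam1, lam2 = 0.1, 0.1
batch = 32

def grad_f(x):
    """Mini-batch gradient estimate of the expected smooth loss."""
    idx = rng.choice(len(b), size=batch, replace=False)
    Ab, bb = A[idx], b[idx]
    return Ab.T @ (Ab @ x - bb) / batch

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Conservative heuristic step sizes based on a Lipschitz estimate of the
# full-batch gradient and the spectral norm of L.
beta = np.linalg.norm(A, 2) ** 2 / len(b)
tau = 0.5 / (beta + np.linalg.norm(L, 2))
sigma = tau

x = np.zeros(n)
y = np.zeros(m)
for k in range(5000):
    # Primal step: forward step on the stochastic gradient, prox of g.
    x_new = prox_l1(x - tau * (grad_f(x) + L.T @ y), tau * lam1)
    # Dual step: prox of sigma * h* via the Moreau decomposition,
    # prox_{sigma h*}(v) = v - sigma * prox_{h/sigma}(v/sigma).
    v = y + sigma * L @ (2 * x_new - x)
    y = v - sigma * prox_l1(v / sigma, lam2 / sigma)
    x = x_new

obj = 0.5 * np.mean((A @ x - b) ** 2) + lam1 * np.abs(x).sum() + lam2 * np.abs(L @ x).sum()
print(f"final objective estimate: {obj:.4f}")
```

With ℓ1 for h, the dual prox above reduces to clipping onto the box [-lam2, lam2]; the Moreau-decomposition form is used because it generalizes to other choices of h without re-deriving the conjugate's prox.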

Citations (1)
