
Almost Sure Convergence of Random Projected Proximal and Subgradient Algorithms for Distributed Nonsmooth Convex Optimization (1510.07107v1)

Published 24 Oct 2015 in math.OC

Abstract: Two distributed algorithms are described that enable all users connected over a network to cooperatively minimize the sum of all users' objective functions over the intersection of all users' constraint sets, where each user has its own private nonsmooth convex objective function and a closed convex constraint set given by the intersection of a number of simple, closed convex sets. The first algorithm has each user adjust its estimate by applying the proximity operator of its objective function and then the metric projection onto one set randomly selected from the simple, closed convex sets. The second is a distributed random projection algorithm that updates each user's estimate with a subgradient of its objective function instead of the proximity operator. Analysis of the two algorithms under a diminishing step-size rule shows that, under certain assumptions, the sequences generated for all users by each algorithm converge almost surely to the same solution. A convergence rate analysis of the two algorithms is also provided, and choices of step-size sequences that yield fast convergence are discussed. Numerical comparisons on concrete nonsmooth convex optimization problems support the convergence analysis and demonstrate the effectiveness of the two algorithms.
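The structure of the two updates can be illustrated with a minimal Python sketch. This is an assumption-laden illustration, not the paper's exact scheme: the ℓ1-norm objective, Euclidean-ball constraint sets, uniform consensus weights, and 1/k step size are placeholder choices made for concreteness. Only the overall shape of each iteration, a consensus average, then a proximal or subgradient step, then projection onto one randomly chosen simple set, follows the abstract's description.

```python
import numpy as np

# Hedged sketch of the two updates described in the abstract. The local
# objective (an l1 norm), the simple constraint sets (Euclidean balls),
# the uniform consensus weights, and the 1/k step size are illustrative
# assumptions, not the paper's exact choices.

def prox_l1(v, alpha):
    """Proximity operator of alpha * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - alpha, 0.0)

def subgrad_l1(v):
    """A subgradient of ||.||_1 at v."""
    return np.sign(v)

def project_ball(v, radius=1.0):
    """Metric projection onto a Euclidean ball of the given radius,
    one example of a 'simple, closed convex set'."""
    norm = np.linalg.norm(v)
    return v if norm <= radius else (radius / norm) * v

def proximal_update(estimates, weights, step, projections, rng):
    """One user's random projected proximal step: consensus average,
    proximity operator of the local objective, then projection onto
    one randomly selected simple set. In a real network each user
    would average only its neighbors' estimates."""
    v = sum(w * x for w, x in zip(weights, estimates))
    y = prox_l1(v, step)
    return projections[rng.integers(len(projections))](y)

def subgradient_update(estimates, weights, step, projections, rng):
    """Same structure, with a subgradient step replacing the
    proximity operator."""
    v = sum(w * x for w, x in zip(weights, estimates))
    y = v - step * subgrad_l1(v)
    return projections[rng.integers(len(projections))](y)

rng = np.random.default_rng(0)
n_users, dim = 3, 5
x = [rng.standard_normal(dim) for _ in range(n_users)]
weights = [1.0 / n_users] * n_users   # assumed uniform consensus weights
projections = [project_ball]          # one simple set per user, for brevity
for k in range(1, 500):
    step = 1.0 / k                    # diminishing step-size rule
    x = [proximal_update(x, weights, step, projections, rng) for _ in x]
print(np.round(x[0], 4))              # users' estimates agree in the limit
```

Swapping `proximal_update` for `subgradient_update` in the loop gives the second algorithm; per the abstract, both converge almost surely to the same solution under the diminishing step-size rule.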
