
Global convergence and acceleration of projection methods for feasibility problems involving union convex sets (2202.10052v4)

Published 21 Feb 2022 in math.OC

Abstract: We prove global convergence of classical projection algorithms for feasibility problems involving union convex sets, which refer to sets expressible as the union of a finite number of closed convex sets. We present a unified strategy for analyzing global convergence by means of studying fixed-point iterations of a set-valued operator that is the union of a finite number of compact-valued upper semicontinuous maps. Such a generalized framework permits the analysis of a class of proximal algorithms for minimizing the sum of a piecewise smooth function and the difference between the pointwise minimum of finitely many weakly convex functions and a piecewise smooth convex function. When realized on two-set feasibility problems, this algorithm class recovers alternating projections and averaged projections as special cases, and thus we obtain global convergence criteria for these projection algorithms. Using these general results, we derive sufficient conditions to guarantee global convergence of several projection algorithms for solving the sparse affine feasibility problem and a feasibility reformulation of the linear complementarity problem. Notably, we obtain global convergence of both the alternating and the averaged projection methods to the solution set for linear complementarity problems involving $P$-matrices. By leveraging the structures of the classes of problems we consider, we also propose acceleration algorithms with guaranteed global convergence. Numerical results further demonstrate that the proposed acceleration schemes significantly improve upon their non-accelerated counterparts in efficiency.
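To make the two projection methods mentioned in the abstract concrete, the sketch below applies alternating and averaged projections to the sparse affine feasibility problem (find $x$ with $Ax = b$ and $\|x\|_0 \le s$), where the sparsity constraint is a union of convex sets (coordinate subspaces) and its projection is hard thresholding. This is a minimal illustration, not the authors' implementation; the function names (`project_sparse`, `alternating_projections`, etc.), the iteration counts, and the random test instance are all assumptions made for the example.

```python
# Minimal sketch (not the authors' code): alternating and averaged projections
# for the sparse affine feasibility problem  find x with Ax = b, ||x||_0 <= s.
# The sparsity set {x : ||x||_0 <= s} is a union of convex sets (coordinate
# subspaces), so its Euclidean projection is the hard-thresholding operator.
import numpy as np

def project_sparse(x, s):
    """Keep the s largest-magnitude entries of x; zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    z[idx] = x[idx]
    return z

def project_affine(x, A, b, A_pinv):
    """Project x onto the affine set {y : Ay = b} using the pseudoinverse of A."""
    return x - A_pinv @ (A @ x - b)

def alternating_projections(A, b, s, x0, iters=500):
    """x_{k+1} = P_affine(P_sparse(x_k))."""
    A_pinv = np.linalg.pinv(A)
    x = x0.copy()
    for _ in range(iters):
        x = project_affine(project_sparse(x, s), A, b, A_pinv)
    return x

def averaged_projections(A, b, s, x0, iters=500):
    """x_{k+1} = (P_sparse(x_k) + P_affine(x_k)) / 2."""
    A_pinv = np.linalg.pinv(A)
    x = x0.copy()
    for _ in range(iters):
        x = 0.5 * (project_sparse(x, s) + project_affine(x, A, b, A_pinv))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, s = 30, 60, 5
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
    b = A @ x_true
    x = alternating_projections(A, b, s, rng.standard_normal(n))
    print("residual ||Ax - b|| =", np.linalg.norm(A @ x - b))
    print("nonzeros in x:", np.count_nonzero(np.round(x, 8)))
```

The paper's acceleration schemes and its treatment of linear complementarity problems with $P$-matrices are not reflected in this toy example; it only shows the basic fixed-point iterations whose global convergence the paper analyzes.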
