On the rate of convergence of alternating minimization for non-smooth non-strongly convex optimization in Banach spaces (1911.00404v1)
Abstract: In this paper, the convergence of alternating minimization is established for non-smooth convex optimization in Banach spaces, and novel rates of convergence are provided. The objective function is a composition of a smooth and a non-smooth part, with the latter being block-separable, e.g., corresponding to convex constraints or regularization. For the smooth part, three different relaxations of strong convexity are considered: (i) quasi-strong convexity; (ii) quadratic functional growth; and (iii) plain convexity. Linear convergence is established for the first two cases, generalizing and improving previous results for strongly convex problems; sublinear convergence is established for the third case, also improving previous results from the literature. All the convergence results have in common that, in contrast to previous corresponding results for general block coordinate descent, the performance of alternating minimization is governed by the properties of the individual blocks rather than by global properties. Ultimately, not only does the better-conditioned block determine the performance, as has been similarly observed in the literature; the worse-conditioned block additionally enhances the performance, resulting in potentially significantly improved convergence rates. Furthermore, since only the convexity and smoothness properties of the problem are used, the results apply immediately in general Banach spaces.
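The paper itself provides the analysis, not code; as a minimal illustration of the setting the abstract describes, the sketch below runs two-block alternating minimization on a toy problem whose smooth part is a coupling quadratic and whose non-smooth, block-separable parts are nonnegativity constraints (indicator functions of convex sets, one of the cases named in the abstract). Each block subproblem is solved exactly via nonnegative least squares. All problem data, dimensions, and iteration counts here are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Toy instance of   min_{x >= 0, y >= 0}  0.5 * ||A x + B y - b||^2,
# i.e. a smooth quadratic plus block-separable indicator functions
# of the nonnegative orthant. Data are random and illustrative.
rng = np.random.default_rng(0)
m, n1, n2 = 50, 10, 8
A = rng.standard_normal((m, n1))
B = rng.standard_normal((m, n2))
b = rng.standard_normal(m)

def objective(x, y):
    r = A @ x + B @ y - b
    return 0.5 * r @ r

x = np.zeros(n1)
y = np.zeros(n2)
prev = objective(x, y)
for k in range(100):
    # Exact minimization over the x-block with y fixed:
    # a nonnegative least-squares problem on the residual b - B y.
    x, _ = nnls(A, b - B @ y)
    # Exact minimization over the y-block with x fixed.
    y, _ = nnls(B, b - A @ x)
    val = objective(x, y)
    if k % 10 == 0:
        print(f"iter {k:3d}  objective {val:.6e}")
    if prev - val < 1e-12:  # monotone decrease has stalled
        break
    prev = val
```

The printed objective values decrease monotonically, since each block update is an exact minimization; the linear and sublinear rates established in the paper quantify how fast this decrease is under the three stated relaxations of strong convexity.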