Random coordinate descent methods for nonseparable composite optimization (2203.14368v2)
Abstract: In this paper we consider large-scale composite optimization problems whose objective function is the sum of two (possibly nonconvex) terms: one has a (block) coordinate-wise Lipschitz continuous gradient and the other is differentiable but nonseparable. Under these general settings we derive and analyze two new coordinate descent methods. The first algorithm, referred to as the coordinate proximal gradient method, exploits the composite form of the objective function, while the second disregards the composite form and uses the partial gradient of the full objective, yielding a coordinate gradient descent scheme with novel adaptive stepsize rules. We prove that these new stepsize rules make the coordinate gradient scheme a descent method, provided that additional assumptions hold for the second term of the objective function. We present a complete worst-case complexity analysis for both new methods in the convex and nonconvex settings, provided that the (block) coordinates are chosen randomly or cyclically. Preliminary numerical results also confirm the efficiency of the two algorithms on practical problems.
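For orientation, the sketch below illustrates a generic random coordinate gradient descent iteration of the kind the abstract describes: at each step a single coordinate is sampled, its partial gradient is evaluated on the full objective, and a coordinate step is taken. The function names, the quadratic test problem, and the constant 1/L_i stepsize are illustrative assumptions only; they do not reproduce the paper's adaptive stepsize rules or its coordinate proximal gradient variant.

```python
import numpy as np

def random_coordinate_gradient_descent(grad_f, x0, lipschitz, n_iters=1000, rng=None):
    """Minimal random coordinate gradient descent sketch (hypothetical helper).

    grad_f(x, i) returns the i-th partial derivative of the full objective at x;
    lipschitz[i] is a coordinate-wise Lipschitz constant used here as a fixed
    1/L_i stepsize. The paper's adaptive stepsizes for the nonseparable term
    are not reproduced.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        i = rng.integers(x.size)        # sample one coordinate uniformly at random
        g_i = grad_f(x, i)              # partial gradient of the full objective
        x[i] -= g_i / lipschitz[i]      # coordinate step with stepsize 1/L_i
    return x

# Usage on a simple quadratic 0.5 * x^T A x - b^T x (illustrative test problem)
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    grad = lambda x, i: A[i] @ x - b[i]
    L = np.diag(A)                      # coordinate-wise Lipschitz constants of the quadratic
    x_approx = random_coordinate_gradient_descent(grad, np.zeros(2), L, n_iters=5000)
    print(x_approx, np.linalg.solve(A, b))  # the two should be close
```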