Coordinate projected gradient descent minimization and its application to orthogonal nonnegative matrix factorization (2504.00770v1)
Abstract: In this paper we consider large-scale composite nonconvex optimization problems whose objective function is the sum of three terms: the first has a block coordinate-wise Lipschitz continuous gradient, the second is twice differentiable but nonseparable, and the third is the indicator function of a separable closed convex set. Under these general settings we derive and analyze a new cyclic coordinate descent method that uses the partial gradient of the differentiable part of the objective, yielding a coordinate gradient descent scheme with a novel adaptive stepsize rule. We prove that this stepsize rule makes the coordinate gradient scheme a descent method, provided that additional assumptions hold for the second term of the objective function. We also present a worst-case complexity analysis for this new method in the nonconvex setting. Numerical results on the orthogonal nonnegative matrix factorization problem confirm the efficiency of our algorithm.
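To make the problem structure concrete, the following is a minimal sketch of block-coordinate projected gradient descent applied to a penalized orthogonal nonnegative matrix factorization model, min over W >= 0, H >= 0 of 0.5*||X - WH||_F^2 + 0.5*lam*||W^T W - I||_F^2, where the data-fit term plays the role of the block-smooth part, the orthogonality penalty is the twice differentiable nonseparable part, and the nonnegativity constraints give the separable indicator. The penalty weight `lam`, the function name `onmf_block_pgd`, and the Lipschitz-style stepsizes below are illustrative assumptions, not the paper's adaptive stepsize rule or its exact algorithm.

```python
# Illustrative sketch (assumed formulation, not the paper's exact method):
# block-coordinate projected gradient descent for penalized orthogonal NMF
#   min_{W>=0, H>=0}  0.5*||X - W H||_F^2 + 0.5*lam*||W^T W - I||_F^2
import numpy as np

def onmf_block_pgd(X, r, lam=1.0, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = np.abs(rng.standard_normal((m, r)))
    H = np.abs(rng.standard_normal((r, n)))
    I = np.eye(r)
    for _ in range(iters):
        # W block: gradient of data-fit term plus orthogonality penalty
        grad_W = (W @ H - X) @ H.T + 2.0 * lam * W @ (W.T @ W - I)
        # Conservative local Lipschitz-style stepsize (assumed, not the paper's rule)
        L_W = np.linalg.norm(H @ H.T, 2) + 2.0 * lam * (3.0 * np.linalg.norm(W.T @ W, 2) + 1.0)
        W = np.maximum(W - grad_W / L_W, 0.0)   # projection onto W >= 0
        # H block: standard nonnegative least-squares gradient step
        grad_H = W.T @ (W @ H - X)
        L_H = max(np.linalg.norm(W.T @ W, 2), 1e-12)
        H = np.maximum(H - grad_H / L_H, 0.0)   # projection onto H >= 0
    return W, H

# Example usage on a small synthetic problem
if __name__ == "__main__":
    X = np.abs(np.random.default_rng(1).standard_normal((50, 40)))
    W, H = onmf_block_pgd(X, r=5)
    print("relative residual:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))
```

The cyclic structure, one projected gradient step per block with a block-specific stepsize, mirrors the scheme described in the abstract; the paper's contribution lies in the adaptive stepsize rule and the accompanying descent and complexity guarantees, which this sketch does not reproduce.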