Linear Convergence of Variable Bregman Stochastic Coordinate Descent Method for Nonsmooth Nonconvex Optimization by Level-set Variational Analysis
Abstract: Large-scale nonconvex and nonsmooth problems have attracted considerable attention in compressed sensing, big-data optimization, and machine learning, and developing effective methods for such problems remains a central research challenge. Stochastic coordinate descent methods have been widely used to solve large-scale optimization problems. In this paper, we establish the convergence of the variable Bregman stochastic coordinate descent (VBSCD) method for a broad class of nonsmooth and nonconvex optimization problems: any accumulation point of the sequence generated by VBSCD is almost surely a critical point. Moreover, we develop a new variational approach on level sets aimed at convergence-rate analysis. If the level-set subdifferential error bound holds, we derive a linear rate of convergence for the expected values of the objective function and of the random variables generated by VBSCD.
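The paper's VBSCD method is not specified in the abstract, but the general template of randomized coordinate descent for a smooth-plus-nonsmooth objective can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: it uses the Euclidean distance as the simplest Bregman choice, so each coordinate update reduces to a proximal (soft-threshold) step, and the problem, step sizes, and function names below are all assumptions for illustration.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal map of t*|.|, applied coordinate-wise.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def random_coordinate_descent(A, b, lam, steps=5000, seed=0):
    """Generic randomized proximal coordinate descent for
    f(x) = 0.5*x'Ax - b'x + lam*||x||_1 (illustrative only;
    the Euclidean proximal step is the simplest Bregman choice)."""
    rng = np.random.default_rng(seed)
    n = b.size
    x = np.zeros(n)
    for _ in range(steps):
        i = rng.integers(n)              # sample one coordinate uniformly
        grad_i = A[i] @ x - b[i]         # partial gradient w.r.t. x[i]
        L_i = A[i, i]                    # coordinate-wise Lipschitz constant
        x[i] = soft_threshold(x[i] - grad_i / L_i, lam / L_i)
    return x

# Usage: a small strongly convex instance.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = random_coordinate_descent(A, b, lam=0.1)
```

On strongly convex instances such as this one, a linear convergence rate in expectation is classical; the paper's contribution is extending such rate guarantees to nonsmooth nonconvex problems under a level-set subdifferential error bound.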