Asynchronous Parallel Nonconvex Optimization Under the Polyak-Lojasiewicz Condition
Abstract: Communication delays and synchronization are major bottlenecks for parallel computing, and tolerating asynchrony is therefore crucial for accelerating parallel computation. Motivated by optimization problems that do not satisfy convexity assumptions, we present an asynchronous block coordinate descent algorithm for nonconvex optimization problems whose objective functions satisfy the Polyak-Lojasiewicz condition. This condition is a generalization of strong convexity to nonconvex problems and requires neither convexity nor uniqueness of minimizers. Assuming only mild smoothness of the objective function and bounded delays, we prove that the algorithm attains a linear convergence rate. Numerical experiments on logistic regression problems illustrate the impact of asynchrony on convergence.
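For reference, the standard form of the Polyak-Lojasiewicz condition states that, for some constant $\mu > 0$ and minimum value $f^*$,
$$\tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^*\bigr) \quad \text{for all } x,$$
an inequality satisfied by every strongly convex function but also by many nonconvex objectives with non-unique minimizers.

As an informal illustration of the setting, the sketch below emulates block coordinate descent on an $\ell_2$-regularized logistic regression problem, where each block update uses a gradient that may be up to $\tau$ iterations stale (a bounded delay). The names, constants, and serial emulation of asynchrony are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Toy serial emulation of bounded-delay block coordinate descent on
# l2-regularized logistic regression (illustrative only).
rng = np.random.default_rng(0)
n, d, tau, steps, lr = 200, 20, 5, 2000, 0.5
A = rng.standard_normal((n, d))
y = rng.integers(0, 2, n) * 2 - 1            # labels in {-1, +1}
lam = 0.1                                     # l2 term keeps a positive PL constant

def grad(x):
    z = -y * (A @ x)
    s = 1.0 / (1.0 + np.exp(-z))              # sigmoid of the negative margin
    return -(A.T @ (y * s)) / n + lam * x

x = np.zeros(d)
history = [x.copy()]                          # past iterates, used to emulate staleness
blocks = np.array_split(np.arange(d), 4)      # coordinate blocks, one per "worker"

for k in range(steps):
    b = blocks[k % len(blocks)]               # each worker updates only its own block
    delay = rng.integers(0, tau + 1)          # bounded delay: gradient up to tau steps old
    x_stale = history[max(0, len(history) - 1 - delay)]
    x[b] -= lr * grad(x_stale)[b]             # block update computed from the stale iterate
    history.append(x.copy())

print("final gradient norm:", np.linalg.norm(grad(x)))
```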