Kurdyka-Łojasiewicz exponent via Hadamard parametrization (2402.00377v2)

Published 1 Feb 2024 in math.OC

Abstract: We consider a class of $\ell_1$-regularized optimization problems and the associated smooth "over-parameterized" optimization problems built upon the Hadamard parametrization, or equivalently, the Hadamard difference parametrization (HDP). We characterize the set of second-order stationary points of the HDP-based model and show that they correspond to some stationary points of the corresponding $\ell_1$-regularized model. More importantly, we show that the Kurdyka-Łojasiewicz (KL) exponent of the HDP-based model at a second-order stationary point can be inferred from that of the corresponding $\ell_1$-regularized model under suitable assumptions. Our assumptions are general enough to cover a wide variety of loss functions commonly used in $\ell_1$-regularized models, such as the least squares loss and the logistic loss. Since the KL exponents of many $\ell_1$-regularized models are explicitly known in the literature, our results allow us to leverage these known exponents to deduce the KL exponents at second-order stationary points of the corresponding HDP-based models, which were previously unknown. Finally, we demonstrate how these explicit KL exponents at second-order stationary points can be applied to deduce the explicit local convergence rate of a standard gradient descent method for minimizing the HDP-based model.
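
To make the setup concrete, here is one plausible rendering of the two models the abstract refers to; the specific weighting of the quadratic penalty is an assumption based on the standard HDP identity, not taken from the paper itself:

$$
\min_{x \in \mathbb{R}^n} \; f(x) + \mu \|x\|_1
\qquad\longleftrightarrow\qquad
\min_{u, v \in \mathbb{R}^n} \; F(u,v) := f(u \circ u - v \circ v) + \mu \left( \|u\|^2 + \|v\|^2 \right),
$$

where $\circ$ denotes the Hadamard (entrywise) product. The correspondence rests on the identity $\|x\|_1 = \min \{ \|u\|^2 + \|v\|^2 : u \circ u - v \circ v = x \}$, so the smooth model attains the same optimal value as the $\ell_1$-regularized one while trading the nonsmooth penalty for over-parametrization.

The sketch below is a minimal, hypothetical illustration (not the paper's code) of the standard gradient descent method mentioned in the abstract, applied to the HDP-based model with the least squares loss $f(x) = \tfrac{1}{2}\|Ax - b\|^2$; the step size, initialization, and problem data are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, mu = 50, 100, 0.1

# Synthetic l1-regularized least squares instance: f(x) = 0.5*||A x - b||^2.
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(m)

# HDP-based smooth model: F(u, v) = f(u*u - v*v) + mu*(||u||^2 + ||v||^2).
u = 0.1 * np.ones(n)  # initialize away from the origin, which is a stationary point of F
v = 0.1 * np.ones(n)
step = 0.1 / np.linalg.norm(A, 2) ** 2  # conservative constant step size (illustrative)

for _ in range(20000):
    x = u * u - v * v             # Hadamard difference parametrization
    g = A.T @ (A @ x - b)         # gradient of the least squares loss at x
    gu = 2 * u * g + 2 * mu * u   # chain rule: dF/du
    gv = -2 * v * g + 2 * mu * v  # chain rule: dF/dv
    u, v = u - step * gu, v - step * gv

x = u * u - v * v
print("recovered support:", np.flatnonzero(np.abs(x) > 1e-4))
```

Because $F$ is smooth, the KL exponents established in the paper translate, as the abstract states, into explicit local convergence rates for exactly this kind of iteration near second-order stationary points.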
