An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization (2209.09119v5)
Abstract: This paper focuses on the minimization of the sum of a twice continuously differentiable function $f$ and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed, in which the Hessian of $f$ is regularized by a term involving the $\varrho$-th power of the KKT residual. For $\varrho=0$, we justify the global convergence of the iterate sequence for a KL objective function and its R-linear convergence rate for a KL objective function of exponent $1/2$. For $\varrho\in(0,1)$, by assuming that the cluster points satisfy a local H\"{o}lderian error bound of order $q$ on a second-order stationary point set and a local error bound of order $q>1+\varrho$ on the common stationary point set, respectively, we establish the global convergence of the iterate sequence and its superlinear convergence rate with order depending on $q$ and $\varrho$. A dual semismooth Newton augmented Lagrangian method is also developed to seek inexact minimizers of the subproblems. Numerical comparisons with two state-of-the-art methods on $\ell_1$-regularized Student's $t$-regressions, group penalized Student's $t$-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.
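For concreteness, below is a minimal sketch of one regularized proximal Newton step for the $\ell_1$-regularized case, based only on what the abstract states. It assumes the regularized Hessian model takes the common form $G_k=\nabla^2 f(x^k)+c\,\|R(x^k)\|^{\varrho} I$, where $R(x)=x-\mathrm{prox}_{\lambda\|\cdot\|_1}(x-\nabla f(x))$ is the natural KKT residual; the constant `c`, the toy least-squares data, and the function names (`irpn_step`, `soft_threshold`) are illustrative choices, not taken from the paper. The inner subproblem is solved inexactly by a plain proximal-gradient loop here, standing in for the dual semismooth Newton augmented Lagrangian method used in the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def kkt_residual(x, grad_f, lam):
    """Norm of the natural residual x - prox_{lam*||.||_1}(x - grad f(x))."""
    return np.linalg.norm(x - soft_threshold(x - grad_f, lam))

def irpn_step(x, grad_f, hess_f, lam, varrho=0.5, c=1.0, inner_iters=200):
    """One inexact regularized proximal Newton step for f(x) + lam*||x||_1.

    The Hessian is regularized by a multiple of the varrho-th power of the
    KKT residual (an assumed concrete form of the rule named in the abstract);
    the subproblem is solved only approximately.
    """
    r = kkt_residual(x, grad_f, lam)
    G = hess_f + c * (r ** varrho) * np.eye(len(x))   # regularized Hessian model
    # Inexactly minimize  <grad_f, y-x> + 0.5*(y-x)'G(y-x) + lam*||y||_1  over y
    # with a simple proximal-gradient inner loop (stand-in for the paper's
    # dual semismooth Newton augmented Lagrangian subproblem solver).
    L = np.linalg.norm(G, 2)                          # Lipschitz constant of the quadratic part
    y = x.copy()
    for _ in range(inner_iters):
        grad_q = grad_f + G @ (y - x)
        y = soft_threshold(y - grad_q / L, lam / L)
    return y

# Toy usage on f(x) = 0.5*||Ax - b||^2 (hypothetical data; the paper's experiments
# instead use nonconvex Student's t-losses and image restoration).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
    lam, x = 0.1, np.zeros(10)
    for k in range(10):
        grad_f, hess_f = A.T @ (A @ x - b), A.T @ A
        x = irpn_step(x, grad_f, hess_f, lam)
        print(k, kkt_residual(x, A.T @ (A @ x - b), lam))
```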