Inexact Newton-CG Algorithms With Complexity Guarantees (2109.14016v2)
Abstract: We consider variants of a recently developed Newton-CG algorithm for nonconvex problems \citep{royer2018newton} in which inexact estimates of the gradient and the Hessian are used in various steps. Under certain conditions on the inexactness measures, we derive iteration complexity bounds for achieving $\epsilon$-approximate second-order optimality that match the best known lower bounds. Our inexactness condition on the gradient is adaptive, allowing for crude accuracy in regions where the gradient is large. We describe two variants of our approach: one in which the step-size along the computed search direction is chosen adaptively, and another in which the step-size is pre-defined. To attain second-order optimality, our algorithms make use of a negative curvature direction on some steps, and these directions can be obtained, with high probability, by a randomized algorithm. In this sense, all of our results hold with high probability over the run of the algorithm. We evaluate the performance of our proposed algorithms empirically on several machine learning models.
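To make the algorithmic outline in the abstract concrete, below is a minimal Python/NumPy sketch of one iteration in this spirit: a truncated-CG step on a regularized Newton system when the (inexact) gradient estimate is large, a randomized negative-curvature probe otherwise, and a backtracking line search playing the role of the adaptive step-size variant. All names (`inexact_newton_cg_step`, `grad_est`, `hess_vec`), tolerances, and the power-iteration fallback are illustrative assumptions standing in for the more carefully specified procedures in the paper; this is a sketch of the structure, not the authors' method.

```python
import numpy as np

def inexact_newton_cg_step(f, grad_est, hess_vec, x, eps_g=1e-4, eps_H=1e-2,
                           zeta=0.5, max_cg=200, seed=None):
    """One simplified iteration in the spirit of an inexact Newton-CG method.

    f        : objective, f(x) -> float
    grad_est : inexact gradient estimate, grad_est(x) -> ndarray
    hess_vec : (possibly inexact) Hessian-vector product, hess_vec(x, v) -> ndarray
    Returns (new iterate, done flag). Tolerances are illustrative only.
    """
    rng = np.random.default_rng(seed)
    n = x.size
    g = grad_est(x)                          # inexact gradient estimate
    if np.linalg.norm(g) > eps_g:
        # Truncated CG on the regularized system (H + 2*eps_H*I) d = -g,
        # stopping early on a relative residual test or when small/negative
        # curvature is encountered along the search.
        d, r = np.zeros(n), -g.copy()
        p = r.copy()
        for _ in range(max_cg):
            Hp = hess_vec(x, p) + 2.0 * eps_H * p
            curv = p @ Hp
            if curv <= eps_H * (p @ p):      # curvature too small: use p itself
                d = p / np.linalg.norm(p)
                if g @ d > 0:                # flip sign to ensure descent
                    d = -d
                break
            a = (r @ r) / curv
            d = d + a * p
            r_new = r - a * Hp
            if np.linalg.norm(r_new) <= zeta * np.linalg.norm(g):
                break                        # inexact solve is good enough
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
    else:
        # Randomized negative-curvature search: power iteration on M*I - H
        # approximates the eigenvector of H with the most negative eigenvalue.
        M = 10.0                             # assumed bound on ||H||; illustrative
        v = rng.standard_normal(n)
        v /= np.linalg.norm(v)
        for _ in range(100):
            v = M * v - hess_vec(x, v)
            v /= np.linalg.norm(v)
        if v @ hess_vec(x, v) >= -eps_H:
            return x, True                   # approximate second-order point
        d = -v if g @ v > 0 else v           # descend along negative curvature
    # Backtracking line search (the adaptive step-size variant); the other
    # variant in the paper would use a pre-defined step length here instead.
    alpha, fx = 1.0, f(x)
    while f(x + alpha * d) > fx - 1e-4 * alpha**2 * (d @ d):
        alpha *= 0.5
        if alpha < 1e-12:
            break
    return x + alpha * d, False
```

The paper's complexity guarantees rest on specific capping, tolerance, and probability parameters for the CG solve and the randomized negative-curvature routine, which this sketch does not reproduce.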