A Simple Adaptive Step-size Choice for Iterative Optimization Methods (1802.00339v2)
Abstract: We suggest a simple adaptive step-size procedure, requiring no line search, for a general class of nonlinear optimization methods, and prove convergence of a general method under mild assumptions. In particular, the goal function may be non-smooth and non-convex. Unlike descent line-search methods, the procedure does not require a monotone decrease of the goal function values along the iteration sequence, and it substantially reduces the implementation cost of each iteration. Its key element is a majorant step-size sequence: the next element of the sequence is taken only if the current iterate fails to give a sufficient descent. Applications of this procedure yield, in particular, a new gradient projection method for smooth constrained optimization problems and a new projection-type method for minimizing the gap function of a general variational inequality. Preliminary computational experiments confirm the efficiency of the proposed modification.
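The mechanism described in the abstract can be sketched as follows. This is a minimal illustration based only on the abstract's description, not the paper's exact algorithm: a prescribed decreasing ("majorant") step-size sequence is consumed one element at a time, each iterate is accepted without a line search (so monotone decrease of the goal function is not enforced), and the method advances to the next, smaller step size only when the current step fails a sufficient-descent test. The Armijo-style descent test and the parameter `delta` are assumptions chosen for illustration.

```python
import numpy as np

def adaptive_step_gradient(f, grad, x0, lambdas, delta=1e-4, max_iter=200):
    """Sketch of an adaptive step-size gradient method (illustrative only).

    `lambdas` is a decreasing majorant step-size sequence; the index into
    it advances only when a step fails to give sufficient descent.
    """
    x = np.asarray(x0, dtype=float)
    k = 0  # index into the majorant step-size sequence
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < 1e-10:
            break
        step = lambdas[min(k, len(lambdas) - 1)]
        x_new = x - step * g
        # Sufficient-descent test (an Armijo-style surrogate; the paper's
        # exact condition may differ).
        if f(x_new) > f(x) - delta * step * np.dot(g, g):
            k += 1  # insufficient descent: move to the next, smaller step
        # The iterate is accepted regardless -- no line search, and no
        # monotone decrease of f is required.
        x = x_new
    return x
```

For example, minimizing the quadratic f(x) = ||x||^2 from x0 = (3, -2) with lambdas = [0.4 * 0.5**i for i in range(30)] drives the iterates toward the minimizer at the origin.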