
An Improved Gradient Method with Approximately Optimal Stepsize Based on Conic Model for Unconstrained Optimization

Published 22 Jul 2019 in math.OC (arXiv:1907.10494v1)

Abstract: A new type of stepsize, recently introduced by Liu and Liu (Optimization, 67(3), 427-440, 2018), is called the approximately optimal stepsize and is quite efficient for gradient methods. Interestingly, all gradient methods can be regarded as gradient methods with approximately optimal stepsizes. In this paper, based on the work (Numer. Algorithms 78(1), 21-39, 2018), we present an improved gradient method with approximately optimal stepsize based on a conic model for unconstrained optimization. If the objective function $ f $ is not close to a quadratic on the line segment between the current and latest iterates, we construct a conic model to generate an approximately optimal stepsize for the gradient method, provided the conic model can be used; otherwise, we construct quadratic models to generate approximately optimal stepsizes. The convergence of the proposed method is analyzed under suitable conditions. Numerical comparisons with well-known conjugate gradient software packages such as CG\_DESCENT (SIAM J. Optim. 16(1), 170-192, 2005) and CGOPT (SIAM J. Optim. 23(1), 296-320, 2013) indicate that the proposed method is very promising.
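The framework described in the abstract (choose a stepsize $\alpha_k$ that approximately minimizes a local model of $f$ along $-g_k$) can be sketched as follows. This is a hypothetical illustration only: it uses a simple quadratic model whose curvature is estimated from a Barzilai-Borwein-type quotient, not the paper's conic-model rule, and all function and variable names are my own.

```python
import numpy as np

def gradient_method_aos(f, grad, x0, tol=1e-6, max_iter=500):
    """Gradient method with an approximately optimal stepsize (sketch).

    The stepsize alpha_k approximately minimizes a quadratic model
    phi(alpha) = f(x_k) - alpha ||g_k||^2 + 0.5 alpha^2 g_k^T B_k g_k,
    with the curvature estimated from secant information; this reduces
    to the classical BB1 stepsize s_k^T s_k / s_k^T y_k. It is NOT the
    conic-model stepsize of the paper, only an instance of the framework.
    """
    x = x0.astype(float)
    g = grad(x)
    alpha = 1.0 / max(np.linalg.norm(g), 1.0)  # conservative first step
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g          # gradient step with current stepsize
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g    # secant pair
        sty = s @ y
        if sty > 1e-12:
            # approximately optimal stepsize from the quadratic model
            alpha = (s @ s) / sty
        else:
            alpha = 1.0                # curvature unreliable: fall back
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = 0.5 x^T A x
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = gradient_method_aos(f, grad, np.array([1.0, 1.0, 1.0]))
```

On a strictly convex quadratic this stepsize recovers the Barzilai-Borwein method; the paper's contribution is a better model (conic when $f$ deviates from a quadratic) feeding the same stepsize-selection mechanism.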


