
A family of spectral gradient methods for optimization (1812.02974v1)

Published 7 Dec 2018 in math.OC

Abstract: We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the long Barzilai-Borwein (BB) stepsize and the short BB stepsize. Each member of the family is shown to share a certain quasi-Newton property in the least-squares sense. The family also includes some other gradient methods as special cases. We prove that the family of methods is $R$-superlinearly convergent for two-dimensional strictly convex quadratics, and $R$-linearly convergent for such quadratics in any dimension. Numerical results of the family with different settings are presented, which demonstrate that the proposed family is promising.
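The core idea of the abstract can be sketched in code: at each iteration, compute the long BB stepsize $\alpha^{BB1}_k = s_{k-1}^\top s_{k-1} / s_{k-1}^\top y_{k-1}$ and the short BB stepsize $\alpha^{BB2}_k = s_{k-1}^\top y_{k-1} / y_{k-1}^\top y_{k-1}$, then take their convex combination. This is a minimal illustrative sketch, not the paper's implementation: the fixed mixing weight `gamma`, the first-step safeguard, and the function names are assumptions, since the abstract does not specify how the combination is chosen.

```python
import numpy as np

def spectral_gradient(A, b, x0, gamma=0.5, tol=1e-8, max_iter=500):
    """Minimize the strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.

    The stepsize is a convex combination of the long (BB1) and short
    (BB2) Barzilai-Borwein stepsizes. `gamma` is an illustrative fixed
    mixing weight, not the paper's specific selection rule.
    """
    x = x0.astype(float)
    g = A @ x - b                       # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)     # assumed safeguard for the first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g     # iterate and gradient differences
        sy = s @ y
        if sy > 0:                      # BB stepsizes are positive only if s^T y > 0
            bb_long = (s @ s) / sy      # long BB stepsize (BB1)
            bb_short = sy / (y @ y)     # short BB stepsize (BB2)
            alpha = gamma * bb_long + (1 - gamma) * bb_short
        x, g = x_new, g_new
    return x

# Example: a two-dimensional strictly convex quadratic,
# the setting in which the paper proves R-superlinear convergence.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
x_star = spectral_gradient(A, b, np.zeros(2))
```

Setting `gamma=1.0` recovers the pure long-BB method and `gamma=0.0` the pure short-BB method, consistent with the abstract's remark that the family includes other gradient methods as special cases.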

Citations (50)
