A family of spectral gradient methods for optimization (1812.02974v1)
Abstract: We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the long Barzilai-Borwein (BB) stepsize and the short BB stepsize. Each member of the family is shown to share a certain quasi-Newton property in the least-squares sense. The family also includes some other gradient methods as special cases. We prove that the family of methods is $R$-superlinearly convergent for two-dimensional strictly convex quadratics. Moreover, the family is $R$-linearly convergent in the general $n$-dimensional case. Numerical results of the family with different settings are presented, demonstrating that the proposed family is promising.
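As an illustrative sketch of the idea, the two classical BB stepsizes are $\alpha^{\text{long}}_k = s_{k-1}^\top s_{k-1} / s_{k-1}^\top y_{k-1}$ and $\alpha^{\text{short}}_k = s_{k-1}^\top y_{k-1} / y_{k-1}^\top y_{k-1}$, where $s_{k-1} = x_k - x_{k-1}$ and $y_{k-1} = g_k - g_{k-1}$. The snippet below runs gradient descent on a strictly convex quadratic using a fixed convex-combination weight `gamma`; the paper's family is parameterized in its own way, so `gamma` and the function names here are assumptions for illustration only.

```python
import numpy as np

def spectral_gradient(A, b, x0, gamma=0.5, tol=1e-8, max_iter=500):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    using a stepsize that is a convex combination of the long and short BB
    stepsizes. `gamma` in [0, 1] is a hypothetical fixed weight; it is not
    the paper's parameterization."""
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0                        # initial stepsize (no BB info yet)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g    # BB difference vectors
        # Long BB: s^T s / s^T y ; short BB: s^T y / y^T y
        # (s^T y = s^T A s > 0 for a strictly convex quadratic)
        long_bb = (s @ s) / (s @ y)
        short_bb = (s @ y) / (y @ y)
        alpha = gamma * long_bb + (1 - gamma) * short_bb
        x, g = x_new, g_new
    return x

# Two-dimensional strictly convex quadratic, as in the paper's superlinear case
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x_star = spectral_gradient(A, b, np.zeros(2))
```

For this example the minimizer solves $Ax = b$, so `x_star` should be close to `[1.0, 0.1]`; varying `gamma` between 0 (short BB) and 1 (long BB) recovers the two classical BB methods as endpoints.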