An optimization derivation of the method of conjugate gradients (2011.02337v3)
Abstract: We give a derivation of the method of conjugate gradients based on the requirement that each iterate minimizes a strictly convex quadratic on the space spanned by the previously observed gradients. Rather than verifying that the search direction has the correct properties, we show that generating such iterates is equivalent to generating orthogonal gradients, which yields both the search direction and the step length. Our approach gives a straightforward way to see that the search direction of the method of conjugate gradients is a negative scalar times the gradient of minimum Euclidean norm evaluated on the affine span of the iterates generated so far.
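The property described in the abstract can be checked numerically: in the standard conjugate gradient iteration for a strictly convex quadratic, the gradients at successive iterates come out mutually orthogonal. The sketch below is a minimal illustration of this, not the paper's derivation; the test matrix and tolerances are assumptions chosen for the example.

```python
import numpy as np

def conjugate_gradients(A, b, x0, tol=1e-10):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A by the method of conjugate gradients. Returns the
    minimizer and the list of gradients r_k = A x_k - b, which are
    mutually orthogonal (up to rounding)."""
    x = x0.astype(float)
    r = A @ x - b          # gradient of f at the current iterate
    d = -r                 # first search direction: steepest descent
    grads = [r.copy()]
    while np.linalg.norm(r) > tol:
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)   # exact line-search step length
        x = x + alpha * d
        r_new = r + alpha * Ad       # gradient at the new iterate
        beta = (r_new @ r_new) / (r @ r)
        d = -r_new + beta * d        # conjugate search direction
        r = r_new
        grads.append(r.copy())
    return x, grads

# Small SPD test problem (assumed for illustration).
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)
b = rng.standard_normal(5)
x, grads = conjugate_gradients(A, b, np.zeros(5))

# Gradients from distinct iterations are numerically orthogonal.
G = np.array(grads[:-1])
gram = G @ G.T
off_diagonal = gram - np.diag(np.diag(gram))
print(np.allclose(off_diagonal, 0, atol=1e-8))  # orthogonality
print(np.allclose(A @ x, b))                    # x solves Ax = b
```

The orthogonality of the stored gradients is exactly the equivalence the abstract invokes: requiring each iterate to minimize the quadratic over the span of the previous gradients forces the new gradient to be orthogonal to that span.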