Conjugate Gradient Method and its Extensions
Date
2018-05-01
Publisher
Addis Ababa University
Abstract
In this project we investigate the conjugate gradient method and its extensions for solving unconstrained
minimization problems. The linear and nonlinear conjugate gradient methods are two important methods
for solving linear systems of equations and nonlinear optimization problems, respectively. The performance
of the linear conjugate gradient method is tied to the distribution of the eigenvalues of the coefficient
matrix. The nonlinear conjugate gradient method is used for solving large-scale nonlinear optimization
problems and has wide applications in many fields. We also discuss how to use these results to obtain the
convergence of the well-known Fletcher-Reeves and Polak-Ribiere conjugate gradient methods.
Finally, comparisons are made among the steepest descent method, Newton's
method, and the conjugate gradient method on quadratic and nonquadratic problems.
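As a companion to the abstract, the linear conjugate gradient method it describes can be sketched as follows. This is a minimal illustration, not the project's own code: it solves Ax = b for a symmetric positive definite matrix A, with the small test system chosen here purely as an example.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive definite A by linear CG."""
    n = b.size
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x            # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    if max_iter is None:
        max_iter = n         # exact termination in at most n steps (in exact arithmetic)
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact step length for the quadratic model
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # new A-conjugate direction
        rs_old = rs_new
    return x

# Small SPD example system (illustrative only)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the method terminates in at most n iterations; in practice the iteration count is governed by how the eigenvalues of A are clustered, which is the dependence on the spectrum mentioned in the abstract.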
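The nonlinear extension with the Fletcher-Reeves update mentioned in the abstract can likewise be sketched. This is an assumed minimal implementation (backtracking Armijo line search and a steepest-descent restart safeguard are choices made here, not taken from the project), applied to an example quadratic objective.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize f by nonlinear CG with the Fletcher-Reeves beta."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:       # safeguard: restart along steepest descent
            d = -g
        # Backtracking (Armijo) line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves formula
        d = -g_new + beta * d
        g = g_new
    return x

# Example objective: a simple quadratic with minimizer (3, -1)
f = lambda x: (x[0] - 3.0) ** 2 + 10.0 * (x[1] + 1.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 3.0), 20.0 * (x[1] + 1.0)])
x_star = fletcher_reeves_cg(f, grad, np.array([0.0, 0.0]))
```

Replacing the beta formula with g_new @ (g_new - g) / (g @ g) gives the Polak-Ribiere variant, the other method whose convergence the abstract discusses.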
Keywords
Preliminaries, Rate of Convergence, Krylov Subspaces